Google uses its crawler, FeedFetcher, to cache any content inserted into a Google Spreadsheet via the =image("link") formula.
https://docs.google.com/spreadsheet/
For example, if you insert the following formula into one of the spreadsheet cells:
Code:
=image("http://example.com/image.jpg")
Google will send its FeedFetcher crawler to download this image and cache it for display in the spreadsheet.
However, if you add a random parameter to the image URL, FeedFetcher will download it again every time. Say the victim's website hosts a 10 MB PDF file. Inserting a list like the one below into a spreadsheet will make the Google crawler download the same file 1,000 times!
Code:
=image("http://targetname/file.pdf?r=1")
=image("http://targetname/file.pdf?r=2")
=image("http://targetname/file.pdf?r=3")
=image("http://targetname/file.pdf?r=4")
...
=image("http://targetname/file.pdf?r=1000")
All of this can exhaust the bandwidth quota of some site owners. Using nothing more than a browser with a single open tab, anyone can launch a massive HTTP GET flood attack against any web server.
The attacker does not even need a fast connection. Since the formula links to a PDF file (that is, not an image that could be displayed in the spreadsheet), the attacker receives only N/A from Google's servers in response. This makes it fairly easy to amplify the attack [analogous to DNS and NTP amplification - translator's note], which poses a serious threat.
Using a single laptop with several open tabs, simply copying and pasting lists of links to 10 MB files, you can get the Google crawler to download them at more than 700 Mbps.
In my case this lasted 30-45 minutes, until I shut the server down. If I calculated everything correctly, that came to about 240 GB of traffic in 45 minutes.
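Those two figures are consistent with each other; here is a quick back-of-the-envelope check (my arithmetic, assuming decimal gigabytes, not from the original post):
Code:
# Sanity check: does ~240 GB in 45 minutes match the ~700 Mbps claim?
gigabytes = 240                 # reported traffic (decimal GB assumed)
seconds = 45 * 60               # reported duration
bits = gigabytes * 8 * 10**9    # GB -> bits
print(f"{bits / seconds / 10**6:.0f} Mbps")  # ~711 Mbps, in line with ">700 Mbps"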