Info: This is a slightly different post from what you usually see at WRD. It is a concern that popped up while working on "speeding up stuff," and I think/hope it is worth discussing. Please share what you think.
I'm usually a fan of hosting all the files a website uses myself, in the same location as the website itself. When an image or JS file needs to be updated, there is no need to change it at a remote URL; I just update the file hosted under the same website/FTP account.
However, this is not the fastest approach. To speed up websites by distributing requests across multiple hosts and serving files from the location closest to end users, keeping assets on CDNs (content delivery networks) is a very good and widely used solution.
Today, if we have used jQuery, MooTools, Dojo, Prototype, etc. while developing our websites (almost every website uses one of them, including many WordPress, Joomla, and Drupal themes), there is a high chance that we are calling these frameworks from the Google Libraries API.
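For reference, loading a library from Google's CDN usually comes down to a single script tag like the one below (the version number here is only illustrative):

```html
<!-- Pulling jQuery from the Google Libraries API CDN instead of
     self-hosting it; the version number is just an example. -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
```

Every visitor's browser fetches and executes that file directly from Google's servers, which is exactly why the question below matters.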
So, what happens if Google Libraries API gets hacked?
To be more specific, what if the contents of
https://ajax.googleapis.com/ajax/libs/jquery/ver.../jquery.min.js are changed?
I can hardly imagine the scale of the damage it could create.
By the way, I'm aware that the Google Libraries API is built with very good intentions and does its job perfectly (thanks to them), and Google's CDN is probably one of the safest places on the web. But this security concern is worth discussing, considering the effect it could have; every datacenter > server > piece of data can possibly be hacked.
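One partial hedge, without giving up the CDN's speed benefits, is the well-known local-fallback pattern: try the CDN first, and if the library did not load, fall back to a copy on your own server. The version number and local path below are just examples:

```html
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
<script>
  // If the CDN copy failed to load, window.jQuery is undefined;
  // in that case, fall back to a self-hosted copy (path is an example).
  window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');
</script>
```

To be fair, this only protects against the CDN being down or blocked; it does nothing against a tampered file that still loads successfully, which is the scarier scenario discussed here.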
So, is this structure totally wrong, or are the benefits worth the threat? What do you think (I'm really wondering here)?
Credits: Hack The Planet visual.