It isn’t just seedy websites putting browsers at risk anymore: a new report out today shows how the modern Web has been undermined by the increasingly toxic combination of dynamic content and the third-party sources used to deliver it.
Last year Menlo began quantifying that risk and found that 42% of the Alexa Top 100,000 sites were either serving up risky content or vulnerable to compromise.
Researchers from the firm followed up on that today with their State of the Web First Half 2018 report, which examined Web risk based on the top 50 sites in six major countries worldwide. The study offered up statistics to illustrate the key risk factors in how the Web runs today. Top among those is how much active content from third parties, such as content delivery networks (CDNs) and ad delivery networks, is pushed out to users every time they visit a site.
“When a user clicks on a Web link to open a website, they are really opening not just a single website, but at least 25 websites at one time,” the report explains. “If any of these background sites are themselves risky, they could be used by cyberattackers to compromise the site being visited.”
Most sites today are highly dynamic. For most countries studied, the average number of scripts executed per website was between 41 and 42. In the US, some top sites executed as many as 160 scripts from 40 different background sites. As the report explains, these scripts are usually legitimate, used by developers to improve the user experience.
But the more scripts used and the more sources they come from, the broader the attack surface. The bad guys love those scripts because they’re perfect for delivering attacks like iFrame redirects and malvertising links.
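To make the measurement concrete, here is a minimal sketch of how one might count the distinct third-party domains a page pulls scripts from. This is purely illustrative and not Menlo's methodology; the sample HTML, domains, and function names are all hypothetical, and it only inspects static markup (scripts injected at runtime would need a real browser to observe).

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ScriptCollector(HTMLParser):
    """Collect the src attribute of every <script> tag in a page."""
    def __init__(self):
        super().__init__()
        self.sources = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            src = dict(attrs).get("src")
            if src:
                self.sources.append(src)

def third_party_script_domains(html, site_domain):
    """Return the set of external domains a page loads scripts from."""
    parser = ScriptCollector()
    parser.feed(html)
    domains = set()
    for src in parser.sources:
        host = urlparse(src).netloc
        # Relative paths have no netloc and are first-party, so skip them.
        if host and host != site_domain:
            domains.add(host)
    return domains

# Hypothetical page mixing first-party and third-party scripts.
page = """
<html><head>
<script src="https://cdn.example-cdn.net/lib.js"></script>
<script src="https://ads.example-ads.com/tag.js"></script>
<script src="/local/app.js"></script>
</head></html>
"""
print(sorted(third_party_script_domains(page, "example.com")))
# ['ads.example-ads.com', 'cdn.example-cdn.net']
```

Each domain in that set is another party whose compromise would let an attacker run code on the visited page, which is exactly why the report treats the count as a proxy for attack surface.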
The study showed that as many as 46% of the top sites in France are serving active code from risky background sites, followed by 32% of the top UK sites. In the US, that proportion was a bit lower—18% of the Alexa Top 50 sites contain active code served from risky background sites—but that’s still a sizable chunk of what most people would consider to be legitimate sites.
In addition to these risk factors, a number of the top sites worldwide exacerbate things by running their Web properties on vulnerable software. Approximately 8% of US sites, 10% of UK sites, and 20% of French sites were running on outdated platforms.
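One coarse signal for spotting such outdated platforms is the version string a server advertises in its HTTP `Server` header. The sketch below is an assumption-laden illustration, not the report's method: the `MIN_SAFE` baselines are hypothetical, and a real assessment would compare versions against a CVE feed rather than a hardcoded table (many servers also suppress the header entirely).

```python
import re

# Hypothetical minimum-safe versions for illustration only;
# a real check would consult a vulnerability database.
MIN_SAFE = {"nginx": (1, 14, 0), "Apache": (2, 4, 33)}

def flag_outdated(server_header):
    """Flag a Server header that advertises a version below our baseline."""
    m = re.match(r"([\w-]+)/(\d+)\.(\d+)\.(\d+)", server_header)
    if not m:
        return False  # no parseable version advertised; can't tell
    name, *parts = m.groups()
    baseline = MIN_SAFE.get(name)
    # Tuple comparison orders versions component by component.
    return baseline is not None and tuple(map(int, parts)) < baseline

print(flag_outdated("nginx/1.10.3"))   # True
print(flag_outdated("Apache/2.4.41"))  # False
```

The point of the sketch is how little effort such fingerprinting takes: attackers can harvest the same header at scale, which is why running old, version-advertising software is treated as a standing risk factor.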
“I always hammer on websites running old code because of how prevalent that is on the Internet and how it continues to be a big source of malware risks,” says Menlo’s Guruswamy.
The takeaway for security leaders, he says, is to consider how well categorization-based security or URL filtering is protecting their users online today, because those tools don’t cover threats coming from sites that would otherwise be deemed legitimate and safe.
Ericka Chickowski specializes in coverage of information technology and business innovation. She has focused on information security for the better part of a decade and regularly writes about the security industry as a contributor to Dark Reading.