The Rise of Chrome: A Technical Comparison with Competing Browsers

Since 2009, Google Chrome has become the dominant desktop browser, outpacing competitors such as Internet Explorer and Firefox. This study examines the technical factors behind Chrome's rise, including performance benchmarks, features, and adherence to industry standards. The methodology involved benchmarking successive versions of each browser to assess performance across different tasks. The collected data highlights Chrome's superiority in key areas, shedding light on why it has surpassed its rivals in the browser market.

Presentation Transcript


  1. The rise of Chrome. Jonathan Tamary and Dror G. Feitelson. PeerJ Computer Science 1:e28, Oct 2015

  2. The Question
     - Since 2009 Chrome has come to dominate the desktop browser market
     - How has it done this? Specifically, is it technically better than the competition?
     - Implication: is its rise different from that of Internet Explorer?

  3. Technically Better
     - Better performance: measure using common industry benchmarks; add measurements of startup time
     - Better conformance: measure using common industry benchmarks
     - Better features: release features earlier; focus on features that users consider important

  4. The Competition
     - Microsoft Internet Explorer 8-11: the previously dominant browser; bundled with Windows 98, with 76% market share in 1999; antitrust case 1997-2002
     - Mozilla Firefox 3-26: branched from the original Netscape code; tried to compete with Explorer; reached 30% in 2009
     - Google Chrome 1-31

  5. PERFORMANCE

  6. Performance Benchmarks
     Benchmark      | Content                      | Response
     SunSpider      | Javascript tasks             | Time
     BrowserMark    | General browser performance  | Score
     CanvasMark     | <canvas> tag                 | Score
     PeaceKeeper    | Javascript tasks             | Score
     Startup times  | Cold startup time            | Time

  7. Methodology
     - Download and measure all browser versions (some did not work with some benchmarks)
     - Perform measurements on a dedicated machine
     - Use Win7 32-bit for versions up to May 2011, 64-bit for later versions; results on the two systems were consistent
     - Repeat measurements and calculate the standard error (sketched below)
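
A minimal sketch of the standard-error bookkeeping this implies; the helper below is illustrative, not code from the paper, and the sample times are invented:

```typescript
// Mean and standard error over repeated benchmark runs (illustrative).
function meanAndStdError(samples: number[]): { mean: number; stdErr: number } {
  const n = samples.length;
  const mean = samples.reduce((sum, x) => sum + x, 0) / n;
  // Sample variance with Bessel's correction (n - 1).
  const variance =
    samples.reduce((sum, x) => sum + (x - mean) ** 2, 0) / (n - 1);
  // Standard error of the mean = sample std dev / sqrt(n).
  return { mean, stdErr: Math.sqrt(variance / n) };
}

// Example: three hypothetical benchmark runs (times in ms).
console.log(meanAndStdError([312.4, 305.9, 318.1]));
```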

  8. Data Collection
     Benchmark      | Rep | Missing (Explorer) | Missing (Firefox) | Missing (Chrome)
     SunSpider      | 3   | --                 | --                | 30, 31
     BrowserMark    | 3   | 8                  | 3, 3.5, 3.6       | 1
     CanvasMark     | 3   | 8                  | 3                 | 1, 2, 3
     PeaceKeeper    | 3   | --                 | --                | 1, 2, 3, 4
     Startup times  | 20  | --                 | --                | --

  9. SunSpider
     - Developed by WebKit
     - Measures core Javascript performance: tasks rather than microbenchmarks
     - Debate over whether it is representative
     - Explorer's improvement attributed to dead-code elimination, perhaps introduced specifically to improve SunSpider performance (illustrated below)
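
A hedged toy example of why dead-code elimination can inflate a score (this is not SunSpider code): if a benchmark kernel's result is never observed, an optimizing JIT may legally delete the whole loop and report near-zero time without doing the work.

```typescript
// Toy illustration of dead-code elimination (not SunSpider code).
// The accumulator never escapes, so an optimizing JIT may remove the
// entire loop, and the "benchmark" finishes almost instantly.
function deadLoop(): void {
  let acc = 0;
  for (let i = 0; i < 1_000_000; i++) {
    acc += Math.sqrt(i);
  }
  // acc is never used
}

// Using the result keeps the computation alive and forces real work.
function liveLoop(): number {
  let acc = 0;
  for (let i = 0; i < 1_000_000; i++) {
    acc += Math.sqrt(i);
  }
  return acc;
}
```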

  10. BrowserMark 2.0
     - Developed by Rightware, originally for mobile and embedded devices
     - Multiple aspects of general browser performance: page load/resize, WebGL, HTML5, CSS3, canvas

  11. CanvasMark 2013
     - Tests the HTML5 <canvas> tag: a container for graphics drawn with Javascript
     - Stress test using bitmap operations, alpha blending, shadows, and text
     - Goal: stay above 30 fps (see the sketch below)
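
A hedged, browser-only sketch of a CanvasMark-style stress loop with an fps counter; the real benchmark's tests are far more elaborate, and the drawing parameters here are invented:

```typescript
// Minimal CanvasMark-style stress loop (illustrative, runs in a browser).
const canvas = document.createElement("canvas");
canvas.width = 640;
canvas.height = 480;
document.body.appendChild(canvas);
const ctx = canvas.getContext("2d")!;

let frames = 0;
let windowStart = performance.now();

function frame(now: number): void {
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  // Alpha-blended rectangles with shadows, plus text, as on the slide.
  for (let i = 0; i < 200; i++) {
    ctx.globalAlpha = 0.5;
    ctx.shadowColor = "rgba(0, 0, 0, 0.5)";
    ctx.shadowBlur = 4;
    ctx.fillRect(Math.random() * 600, Math.random() * 440, 40, 40);
  }
  ctx.globalAlpha = 1;
  ctx.fillText("stress test", 10, 20);

  frames++;
  if (now - windowStart >= 1000) {
    console.log(`fps: ${frames}`); // goal: stay above 30
    frames = 0;
    windowStart = now;
  }
  requestAnimationFrame(frame);
}
requestAnimationFrame(frame);
```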

  12. PeaceKeeper
     - Developed by FutureMark
     - Focus on Javascript use: <canvas>, DOM tree operations, parsing, video formats, multithreading

  13. Startup Times
     - At boot, run a script that launches the browser 1 minute later
     - Take a timestamp just before launching and pass it to the browser via a URL parameter
     - The browser loads a page that takes another timestamp and sends it to a server
     - The difference between the timestamps is the startup time (sketched below)
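
A hedged sketch of this protocol; the page name, URL parameter, and /report endpoint are invented for illustration:

```typescript
// Launcher side (run by the boot script, ~1 min after boot): take a
// timestamp just before starting the browser and pass it in the URL, e.g.
//   chrome.exe "http://measure.local/start.html?t0=1420000000000"
// where t0 = Date.now() at launch time.

// Page side (start.html): take a second timestamp as soon as the page
// runs, and report the difference (the startup time) to the server.
const params = new URLSearchParams(window.location.search);
const t0 = Number(params.get("t0"));
const startupMs = Date.now() - t0;
fetch(`/report?startupMs=${startupMs}`); // server just logs the value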

  14. CONFORMANCE

  15. Conformance Benchmarks
     Benchmark              | Content                      | Response
     HTML5 compliance       | HTML standard                | Score
     CSS3 test              | CSS standard                 | Score
     Browserscope security  | Security-enhancing features  | Tests passed

  16. HTML5 Compliance
     - HTML is the language used to describe web pages
     - HTML5 was introduced in 2008 and approved in 2014
     - The benchmark checks which features are supported (in the spirit of the sketch below)
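
A hedged feature-detection sketch in the spirit of that benchmark; the real test suite covers far more features, and these four checks are only illustrative:

```typescript
// Feature-detection-style scoring (illustrative, runs in a browser).
const checks: Array<[string, () => boolean]> = [
  ["canvas", () => "getContext" in document.createElement("canvas")],
  ["video", () => "canPlayType" in document.createElement("video")],
  ["geolocation", () => "geolocation" in navigator],
  ["localStorage", () => typeof window.localStorage !== "undefined"],
];

let score = 0;
for (const [name, supported] of checks) {
  const ok = supported();
  if (ok) score++;
  console.log(`${name}: ${ok ? "supported" : "missing"}`);
}
console.log(`score: ${score}/${checks.length}`);
```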

  17. CSS3 Test
     - CSS is the language used to describe web page style
     - Checks recognized elements of the CSS spec
     - Does not check quality of implementation

  18. Browserscope Security
     - Community-driven project to profile browsers
     - Checks support for Javascript APIs for safe interactions
     - Score is the number of tests passed

  19. Other Browsers
     - We compared the 3 top browsers, but there are others: Opera, Safari
     - Why did Chrome gain market share while they did not?
     - If they are just as good technically, then Chrome's advantage is only in marketing

  20. Opera
     - Main contender in the Windows desktop market
     - Opera is technically inferior to Chrome

  21. FEATURES

  22. Feature Selection
     - Start with 43 features of modern browsers
     - Remove 11 that were included in Chrome 1 and already existed in Firefox and Explorer (tabs, history manager, pop-up blocking, ...)
     - Remove 7 that were introduced at about the same time by all 3 browsers (private browsing, full screen, ...)
     - Use the 25 remaining features

  23. The 25 remaining features: add-ons manager, multiple users, download manager, apps, auto-updater, caret navigation, developer tools, personalized new tab, pinned sites, click-to-play, sync, print preview, session restore (automatic), crash/security protection, per-site security configuration, web translation, malware protection, spell checking, outdated plugin detection, do not track, built-in PDF viewer, sandboxing, themes, RSS reader, experimental features

  24. Competitive Advantage
     - A feature should be released ahead of the competition by a meaningful margin
     - Our definition: more than one release cycle
     - This gives an advantage to browsers with slow releases (Internet Explorer)

  25. Wins and Losses
     - A browser wins if it releases a feature ahead of all the competition
     - A browser loses if it releases a feature behind all the competition
     - Each feature can have at most one winner and one loser
     - 7 features with no winner or loser were removed (the timing rule is sketched below)
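
A hedged sketch of the "win" side of this rule. The dates and cycle lengths are invented, and reading "more than one release cycle" as the runner-up's cycle length is our interpretation for illustration, not a definition from the paper:

```typescript
// The earliest releaser of a feature wins only if its lead over the
// runner-up exceeds one release cycle (illustrative interpretation).
interface Release {
  browser: string;
  shipped: Date;
  cycleDays: number; // typical length of this browser's release cycle
}

function featureWinner(releases: Release[]): string | null {
  const sorted = [...releases].sort(
    (a, b) => a.shipped.getTime() - b.shipped.getTime()
  );
  const [first, runnerUp] = sorted;
  const leadDays =
    (runnerUp.shipped.getTime() - first.shipped.getTime()) / 86_400_000;
  return leadDays > runnerUp.cycleDays ? first.browser : null;
}

console.log(
  featureWinner([
    { browser: "Chrome", shipped: new Date("2010-01-10"), cycleDays: 42 },
    { browser: "Firefox", shipped: new Date("2010-06-01"), cycleDays: 42 },
    { browser: "Internet Explorer", shipped: new Date("2011-03-01"), cycleDays: 365 },
  ])
); // "Chrome": a lead of ~142 days exceeds one 42-day cycle
```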

  26. Wins and Losses
     Browser            | Wins | Losses
     Chrome             | 6    | 5
     Firefox            | 5    | 6
     Internet Explorer  | --   | 13

  27. Feature Importance Survey
     - Online survey, 254 participants (HUJI CS Facebook page, TAU CS Facebook page, Reddit.com/r/SampleSize)
     - Rank each of the 25 features on a 5-point scale: 1 = least important, 5 = most important
     - Uses a relative scale (compare features to each other)

  28. Analysis Questions
     - Are features that Chrome won more important than those that Firefox won?
     - Are features that Chrome lost less important than those that the other two browsers lost?
     - Are features that Chrome won more important than those it lost?
     - Need to compare scores for sets of features

  29. Statistical Analysis
     - Conventional approach: find the average importance grade of the features in each set, then use a statistical test for significance
     - This is wrong! Scores do not come from a valid interval scale: the difference between 1 and 2 is not necessarily the same as between 3 and 4
     - Averaging is therefore meaningless

  30. Statistical Analysis
     - Use a methodology developed to compare brands
     - Brand A is superior to B if the distribution of opinions about A dominates the distribution of opinions about B in the stochastic order sense
     - In plain English: the distribution is skewed toward higher scores (the CDF is lower)
     - In our case, the "brands" are sets of features (not Microsoft, Google, Mozilla)

  31. Statistical Analysis
     - The problem: neither distribution may dominate the other (the CDFs cross)
     - Solution procedure, due to Yakir & Gilula, 1998 (sketched below):
       1. Identify homogeneous brands with clustering. These are brands that cannot be distinguished.
       2. Find the widest collapsed scale. Collapsing unites adjacent score levels to achieve dominance, but we want to keep as many levels as possible.
       3. Verify that the resulting dominance is significant.
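
A hedged sketch of the dominance check at the heart of this procedure; the clustering and significance steps of the full Yakir & Gilula method are omitted:

```typescript
// Cumulative distribution of a discrete score distribution.
function cdf(probs: number[]): number[] {
  let acc = 0;
  return probs.map((p) => (acc += p));
}

// A dominates B (first-order stochastic sense) if A's CDF is
// everywhere at or below B's.
function dominates(a: number[], b: number[]): boolean {
  const ca = cdf(a);
  const cb = cdf(b);
  return ca.every((v, i) => v <= cb[i] + 1e-12);
}

// Collapsing adjacent score levels i and i+1 sums their probabilities;
// the procedure applies this when the CDFs cross.
function collapse(probs: number[], i: number): number[] {
  return [...probs.slice(0, i), probs[i] + probs[i + 1], ...probs.slice(i + 2)];
}

// Example with the wins distributions from the next slide:
const chromeWins = [0.16, 0.2, 0.24, 0.23, 0.17];
const firefoxWins = [0.27, 0.22, 0.23, 0.2, 0.09];
console.log(dominates(chromeWins, firefoxWins)); // true: Chrome's CDF is lower
```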

  32. Results for Wins
     Rank | Browser            | Wins | Importance score distribution (1 / 2 / 3 / 4 / 5)
     1    | Chrome             | 6    | 0.16 / 0.20 / 0.24 / 0.23 / 0.17
     2    | Firefox            | 5    | 0.27 / 0.22 / 0.23 / 0.20 / 0.09
     3    | Internet Explorer  | 0    | --

  33. Results for Losses
     Rank | Browser            | Losses | Importance score distribution (1 / 2 / 3-4 / 5)
     1    | Firefox            | 6      | 0.17 / 0.16 / 0.45 / 0.22
     1    | Internet Explorer  | 13     | (same cluster as Firefox)
     2    | Chrome             | 5      | 0.18 / 0.16 / 0.44 / 0.22

  34. Results for Chrome
     Rank | Class  | Features | Importance score distribution (1-2 / 3 / 4 / 5)
     1    | Losses | 5        | 0.33 / 0.18 / 0.27 / 0.22
     2    | Wins   | 6        | 0.36 / 0.24 / 0.23 / 0.17

  35. END GAME

  36. Summary of Results
     Benchmark              | Result
     SunSpider              | Chrome was best through 2010; now Internet Explorer is significantly better
     BrowserMark 2.0        | Chrome is best, Explorer worst
     CanvasMark 2013        | Chrome is relatively good but inconsistent, Firefox worst
     PeaceKeeper            | Chrome is significantly better
     Start-up times         | Initially Chrome was better, but now Firefox is better; Explorer has deteriorated
     HTML5 Compliance       | Chrome is better, Explorer worst
     CSS3 Test              | Chrome is better
     Browserscope Security  | Chrome is better, Firefox worst

  37. Summary of Results
     - Chrome won on 6 features and lost on 5
     - Firefox won on 5 features and lost on 6
     - Internet Explorer lost on 13 features
     - Chrome's wins were more important than Firefox's wins
     - The losses of all browsers were equally important
     - Chrome's losses were slightly more important than its wins

  38. Implications
     - Internet Explorer is proprietary; Firefox is open source, yet more innovative and better than the product of a leading software firm
     - Chrome is related to the open-source Chromium, but how open is it?
     - The main factor is apparently the company/organization, not the development style
     - Firefox & Chrome also moved to rapid releases; slow releases could have contributed to Explorer's demise

  39. Threats to Validity
     - How to measure market share: netmarketshare.com claims Explorer dominates
     - The focus on technical aspects ignores the marketing campaign and the Google brand
     - Didn't check all the smaller browsers; if one of them is better than Chrome, then marketing was decisive
     - The benchmarks used do not cover all aspects (and it is not clear exactly what they do cover), but writing new ones would suffer no lesser threats

  40. Conclusions
     - Chrome's rise is consistent with technical superiority
     - But it is not necessarily the result of technical superiority
     - The Google brand name and the marketing campaign may have had a significant effect
