Articles Tagged ‘performance’

7 Key take-aways from Chrome Dev Summit 2019

I was lucky enough to go to the Chrome Dev Summit in San Francisco this year. There were a thousand people in the venue, and seven thousand on the wait list, so I’m glad that I could attend in person.

You can see all the videos from the talks at CDS 2019 on this playlist

Here are my 7 key take-aways from the event

The web must get to parity with native

Without saying so explicitly, the message was that the web must win, and the Chrome team are working towards a future where the web is the go-to development platform for everything. It’s going gangbusters on desktop, but on mobile, where the user population is moving, it’s not the same story.

So, the team is investing heavily in things like Project Fugu, to create APIs for things like contacts, native file system interaction, Bluetooth and NFC. This all has to go through the standards process, which can be a long road, but it’s necessary to ensure interoperability on the web.

However, there’s still a long way to go to get the web into the Play/App store

The Chrome leadership panel session threw up the point that both Samsung and Microsoft treat PWAs as first-class citizens in their stores. Google has come out with TWAs – Trusted Web Activities. There is a build-and-submit step for the store as well, and the tooling provided only works on macOS right now.


Take a step back from the specific problem and you can see that no one on the stage was happy with the outcome (note – the video of that session is not available online and the full live streams have been taken down). They all want the web to be available, and be discoverable. There is a lot more to be done and it’s not perfect, but they have taken a big step forward with TWAs from having no PWAs in the store, to a pretty simple way to create them and deploy them.
Chrome Leadership Panel at CDS 2019
It is going to be OK everyone, it’s just going to take some more time.

There’s a big focus on making the web faster

The speakers all talked about making the web faster, including how they measure this, with upcoming changes to how Lighthouse is going to incorporate new metrics such as Largest Contentful Paint (LCP) and Cumulative Layout Shift (CLS). All of these metrics are explained in the accompanying documentation, and they are now being labelled in the Performance tab of the web inspector.
New metrics coming to Lighthouse in V6
They are also experimenting with slow and fast loading indicators and interstitials though they didn’t make that much of this during the talk – expect this to change a lot.

Lighthouse is getting a CI mode and a server – and I couldn’t be happier.

The focus on a faster web extended to making React faster

I was surprised at how much focus there was on React, given that the Web Almanac also launched at this event and it showed that React has just a 4.6% share of the web – though it clearly has a much larger mindshare amongst developers.

During one talk the speakers seemed to recommend that you focus on “UI frameworks” not “view libraries”, so Next.js over React. I don’t really understand why Google would go all-out in saying do one over the other, and especially the one that needs the most effort to configure to get high performance.

Google told us to focus on those less privileged than ourselves

A lot of talks covered performance, security, and access to data and services for those that can’t rely on it – for example, the developing world. They encouraged us to test on devices worse than ours, to look at our data, and to find out what the world is really like outside of our bubbles.

Chrome is secure by default

We’ve moved on from the pre-HTTPS era, where users had to be rewarded for being safe with green labels and checkboxes; we’re now at the point where that is taken for granted and you are warned when things might not be as they seem. The security team are gathering data and doing in-depth research to find out which signals users react to, and act on, when something is insecure.

Also in security, there was a great talk on WebAuthn, a way to authenticate your users with minimal or no passwords using the capabilities of your devices or things like Titan security keys.

The open standards process is important, and bringing Microsoft in to Chromium is good for the web

It was fantastic to see the Edge team present, and show off all the things they were working on in collaboration with the Chrome team. Microsoft are bringing the best of what they know from accessibility, security and the Windows platform to the table, which will enhance the whole platform.
Open source contributors to Chromium
I’m also quite confident that we can stop developing for IE very soon. Once Edge launches, businesses adopt it and the IE icon goes away, we’ll see big drop-offs in IE usage.

And that’s it – the event was great and excellently organised. Well done to the MCs as well and everyone behind the scenes. Roll on 2020!

Data-Driven Performance Breakout at Edge Conference

I was lucky enough to attend Edge Conf in London this year, a day that I always truly enjoy. The main sessions of the conference were streamed live and videos will be available later, but the break-outs weren’t recorded. These were the sessions I enjoyed the most and it’s a shame that people won’t see them without being there – so here are my notes on what was said, to the best of my ability (and with a big hat tip to George Crawford for his notes). Patrick Kettner was the moderator.

Q: How can we use the masses of data that RUM collects to get businesses to care about performance?

Business leaders like metrics from companies that they can relate to (i.e. Amazon, eBay) but these aren’t very useful metrics as the scale is completely different. Finding stats from competing or relevant companies is hard, so how do you make them care?

Introducing artificial slowness is one way to convince people, but it’s not good for business. There’s also the risk that you may not see an increase in conversion from speed improvements! Filmstrips are incredibly useful at this point to see what’s going on, and they are available in Chrome DevTools in the super-secret area.

Showing videos to business people makes it really hit home – people hate it when they can visibly see their site suck. It’s like making people watch a user test for their site. Shout out to Lara Hogan at Etsy (their engineering blog is awesome) for her great work on this, something that Yell has copied.

Useful metrics such as first render and SpeedIndex aren’t available in the browser. Using SpeedCurve can really make business people sit up and take notice of performance, because it puts a pretty interface on those things.

All in all, the standard metrics are unlikely to be the best for you, so add in User Timing marks (there’s a very simple polyfill) and graph those, including sending them to WebPageTest, so you can measure the things that are important to you over time. This was done very successfully by The Guardian (hat tip Patrick Hamann).

Q from Ilya Grigorik: The browser loading bar is a lie, yet users wait for it. What metric should it use?

Basically, developers can defer work until after the onload event to hack around the loading spinner. If we stop the spinner at first render, the page isn’t usable yet. If we stop it when the page can be interacted with, when would that be? The browser runs the risk of “feeling slower” or “feeling faster” just by changing the progress bar. Apparently there’s one browser that just shows the bar for three seconds, nothing more.

No real consensus was reached here, but it was a very interesting discussion.

Q: Flaky or dropped connections are important to know about for performance metrics – what can the room say about their experiences gathering offline metrics?

When the FT tried this with their web app, they often exceeded localStorage sizes and sometimes POST size limits (25MB), as users could be offline for a week or more. The Guardian had good success with bundling beacons up into one big POST to save money with Adobe Omniture/SiteCatalyst.

The best solution is the Beacon API (sendBeacon), which promises to deliver the payload at some point (which images/XHR don’t guarantee right now). It’s implemented in Google Analytics – you just have to enable it in the config – but other tracking providers don’t have it yet.
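A hedged sketch of that pattern: use `navigator.sendBeacon` where it exists, and fall back to queueing the payload for a normal XHR/fetch flush. The `sendMetrics` wrapper and `pending` queue are my own illustrative names, not any provider’s API.

```javascript
// Beacon sketch: sendBeacon hands the payload to the browser, which will
// try to deliver it even while the page is unloading. Without it, queue
// the payload for a later flush over XHR/fetch.
const pending = [];

function sendMetrics(url, data) {
  const payload = JSON.stringify(data);
  if (typeof navigator !== 'undefined' &&
      typeof navigator.sendBeacon === 'function') {
    // Returns true if the browser accepted the payload for delivery.
    return navigator.sendBeacon(url, payload);
  }
  pending.push({ url, payload }); // no Beacon support: flush later
  return false;
}

sendMetrics('/analytics', { metric: 'first-render', value: 1234 });
```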

Q: What metrics APIs are missing in browsers?

A unique opportunity to ask Ilya to add APIs into Chrome – not to be passed up:

  • Frame Timing API – requested as an ES7 observable (which is unlikely)
  • Performance Observer – a subscribable stream of events that will need processing to be useful; this will give an accurate frame rate
  • Network error logging API – could work like an error reporter that POSTs to a configurable second origin (via a header, like CSP)
  • JavaScript runtime errors without hacking window.onError
  • SpeedIndex, or a proxy for it – there’s a script for this already but it’s not massively accurate; standardising SpeedIndex would be great
  • First Paint – according to Ilya it’s not possible, and quite subjective browser-to-browser


I’d have loved to stay and chat more (nice to meet Tim Kadlec in person, shout out to the Path to Performance podcast as well), it’s rare to have a lot of the web performance community in the same room at the same time and should definitely happen more often.

If there are things I’ve missed, let me know in the comments or on Twitter (@steveworkman)

Going jQuery-free

It’s 2014 and I’m feeling inspired to change my ways. In 2014, I want to go jQuery-free!

Before I get started, I want to clear the air and put in a big fat disclaimer about my opinions on jQuery. Here we go:

jQuery is an excellent library. It is the right answer for the vast majority of websites and developers, and is still the best way to do cross-browser JavaScript. What I am against is the notion that jQuery is the answer to all JavaScript problems.

Lovely, now that’s done, here’s why I want to do it. Firstly, as lots of people know, jQuery is quite a weighty library considering what it does. Coming in at 32KB for version 2.x and around 40KB for the IE-compatible 1.x branch (gzipped and minified), it’s a significant chunk of page weight before you’ve even started using it. There are alternatives that support the majority of its functions with the same API, such as Zepto, but even that comes in at around 15KB for the most recent version, and can grow larger. The worst thing for me is that I don’t use half of the library: all I really do is select elements, use event handlers and delegation, show/hide things, and change CSS classes. So, I want a library of utility functions that only does those things.
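As a taste of what that utility set might look like – all the names below are illustrative sketches, not the final code from this series – the class helpers can be written against any object with a `className` string, with only the selector helper depending on the DOM:

```javascript
// Minimal jQuery-free helpers: selection plus class manipulation.

function $(selector, ctx) {
  // Returns a real array of matches, so you can map/forEach directly.
  return Array.from((ctx || document).querySelectorAll(selector));
}

function hasClass(el, name) {
  // Pad with spaces so 'btn' never matches 'btn-large'.
  return (' ' + el.className + ' ').indexOf(' ' + name + ' ') !== -1;
}

function addClass(el, name) {
  if (!hasClass(el, name)) {
    el.className = (el.className + ' ' + name).trim();
  }
}

function removeClass(el, name) {
  el.className = (' ' + el.className + ' ')
    .split(' ' + name + ' ').join(' ').trim();
}
```

Modern browsers offer `classList` for the last three, of course; the string versions here are the kind of thing you’d keep for the older-IE fallback path.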

Word to the wise: this is not a new notion, and it follows on very nicely from the work that Remy Sharp has done in this area with his min.js library.

I’m going to write a series of posts as I attempt to separate myself from jQuery and make my websites leaner and faster. The first will be on “what you think you need, and what you actually need”, and will give you ways to work out if this approach is for you, or if you should be sticking with jQuery. Next, I’ll cover the basics of what a minimalist jQuery-replacement library looks like; and finally I’ll cover strategies for dealing with unsupported browsers.

Let me know if there’s anything in particular you want me to cover, and I’ll do my best to go over it for you.