
Smart Cookies

Most of us spend too much time on social media, and if you spend long enough on Facebook you'll start noticing patterns in the articles people share.

Besides the click-bait articles and the recent storm of alt-truth, there are other recurring kinds of content worth noticing.

One that caught my interest over time goes along the lines of:

Science Says These Five Things Prove You're Smart

Intelligent people are more easily distracted at work, study claims

Intelligent people tend to be messy, stay awake longer, and swear more

Black humour is sign of high intelligence, study suggests

The first thing to notice is that when a generalist newspaper tries to legitimise something by claiming "science says" or "study suggests", it's usually bullsh... imprecise, to say the least. The second important aspect is that no one is immune to this kind of article, no matter how big or renowned: in case you didn't bother to click the images, they were taken from Forbes, The Telegraph, The Independent and The Guardian.

We all like to think we are clever, don't we?

I use semicolons; they make me feel smart
Funnily enough, this applies to JavaScript too.

The idea behind Smart Cookies was to take the pis... collect an anthology of articles of that kind.

Since the website I wanted to make wasn't really challenging from a technical perspective, I decided to focus on performance.

The app itself is a series of pages reading Smart Cookies are [{{description}}](link to the article) but [also](link to the next)... on a background resembling the {{description}}.

Smart Cookies are Charming

The first thing to do when it comes to performance is to set yourself goals.

I wanted

Progressive enhancement

First things first: I started with server rendering. Not only is it arguably faster than serving an empty page and then rendering client-side, but I also wanted to offer a nice sharing experience.

Facebook Share

I chose Node.js mainly because I'm comfortable with it: a project this small didn't really have many requirements.

Once the server provided the first render, I needed to style it. I wanted a custom font, but that wasn't really a critical thing. I could have gone for async loading with Web Font Loader, but given the size of my CSS, I preferred to subset only the characters I would use (uppercase Latin characters) with Font Squirrel and inline the font in base64, using one of the most broadly supported formats, WOFF (Web Open Font Format).

Can I Use woff? Data on support for the woff feature across the major browsers from caniuse.com.

Even with the font inlined

@font-face {
    font-family: 'bloxxx';
    src: url(data:application/font-woff;charset=utf-8;base64,d09GRgABAAAAABy...);
}

the gzipped CSS was only 8.5KB. Not bad at all.

The next step was to progressively enhance the navigation with JavaScript: I made the server inject all the JSON data consumed on the backend into a script tag with type application/json, which I could then read:

var data = JSON.parse( document.getElementById( 'data' ).innerHTML );  
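On the server side, the injection can be sketched like this (the helper name and the escaping detail are my assumptions, not the project's code):

```javascript
// Embed the backend's JSON in a non-executing script tag so the
// client can pick it up without an extra request.
function injectData(data) {
  // Escape "<" so a "</script>" inside the data can't break out of the tag.
  const json = JSON.stringify(data).replace(/</g, '\\u003c');
  return '<script type="application/json" id="data">' + json + '</script>';
}
```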

With that data, I could randomise the available articles, and preload the next image.
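The random pick could look something like this (a sketch; `pickNext` is my name for it, and the real selection logic may differ):

```javascript
// Pick a random next article index, avoiding an immediate repeat.
function pickNext(articles, currentIndex) {
  // Draw from the n-1 indices that aren't the current one...
  let next = Math.floor(Math.random() * (articles.length - 1));
  // ...and shift past the current index to skip it.
  if (next >= currentIndex) next += 1;
  return next;
}
```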

function preloadImg( next ) {  
  (new Image()).src = '/images/' + next + '.jpg';
}

If the user clicked the "also..." link, they had an immediate transition to the next view.

Since the app was server rendered, and the shareable URL was fairly important to me, I also added pushState navigation.
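A minimal sketch of that navigation, with the window object passed in so the idea is visible outside a browser (function names are mine, not the project's):

```javascript
// Update the address bar without a full page load, so every view
// keeps a shareable URL. `win` stands in for the browser's window.
function navigateTo(win, article) {
  win.history.pushState({ id: article.id }, '', '/' + article.id);
  // ...then swap the description and background in place.
}

// In the browser, back/forward would be handled with something like:
// window.addEventListener('popstate', function (e) {
//   renderArticle(e.state.id);
// });
```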

Now the experience was matching my expectations (this is me clicking on "also..." with a cold cache, and I promise the GIF is at natural speed):

Wooooosh

Pagespeed

Then came the tricky part.

PageSpeed is a hard client to please.

The first problem I encountered was the render-blocking CSS. I decided to inline that too: on the happy path, most of the navigation happens on the client, so the page is loaded only once. And even for the edge cases falling back to the server, an 8.5KB footprint wasn't too much of a hassle to load along with the HTML.

Once I solved that, I had to deal with image optimisation. And that's where things got serious. I tried all the lossless optimisation tools I knew, but all of them failed to meet the required standard: JpegMini, Optimizilla, TinyJpeg, and ImageOptim, a Mac app offering a GUI for optimisation tools such as JPEGOptim and Jpegtran. I even tried the ImageOptim beta, which includes Google's new Guetzli algorithm, without success.

I started growing frustrated, but then I found this in the PageSpeed documentation:

GIF, PNG, and JPEG formats make up 96% of the entire Internet's image traffic. Because of their popularity, PageSpeed Insights provides specific optimization recommendations. For your convenience, you can download the optimized images directly from PageSpeed Insights.

and I decided to create an endpoint displaying all the images, letting PageSpeed do the hard work for me.
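Such an endpoint boils down to a throwaway page listing every image; a sketch (the function name and URL scheme are assumptions, though the `/images/{id}.jpg` pattern matches the preload snippet above):

```javascript
// A throwaway gallery page listing every image, so PageSpeed
// Insights can analyse them all at once and hand back the
// optimised versions for download.
function imageGallery(ids) {
  return '<!DOCTYPE html><html><body>' +
    ids.map(function (id) {
      return '<img src="/images/' + id + '.jpg" alt="' + id + '">';
    }).join('') +
    '</body></html>';
}
```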

At this point I got 99/100 on both mobile and desktop. The only error left was:

Leverage browser caching for the following cacheable resources:

https://ssl.google-analytics.com/ga.js (2 hours)  

but given that I inject the GA script automatically via Cloudflare (asynchronously), that doing so is really convenient, and that, as widespread as the service is, most users will already have the GA script cached from another website, I decided not to serve the script from my own domain (which, as far as I know, is the only way to satisfy that requirement).

It's really important to know when to stop.

Pixel pusher, javascript-something, front end developer since able to grow a beard, father of two, meteoropathic and with an insane passion for lo-fi music.
