I nerded out in this post. It's longer than most of my posts and features rather a lot of code examples. Hope that's cool.
Did you know that the UK government website has a secret stats page for each of its site pages? If you put /info before a route, it will show you the stats page for any given page. As an example, if we wanted to look at the stats for the government's page on universal credit, we could type https://www.gov.uk/info/universal-credit. You can read more about this here.
I have always admired the UX of these stats pages. They convey a large amount of data without overwhelming the user, which is exactly what I wanted for my own stats page and was a big source of inspiration when building it.
Data Collection
My site collects data in a variety of ways: site tracking, a database, code statistics, real-time connections and data from external sources. But we can't pull from these sources without first setting them up to collect data.
Site Tracking
Site tracking is a powerful tool, but I think if you're going to track your users, you should only collect the minimum you need to make informed decisions. Previous versions of this site used Google Analytics to track users. I have since switched to Fathom.
It should be noted that there are plenty of alternatives to Fathom. As it's the tool I use, some of this article is geared towards it.
Adding Fathom Analytics
Like everything I seem to want to add to my site - "there's a plugin for that". I added gatsby-plugin-fathom by following these steps:
- Sign up to Fathom and acquire a Site ID.
- Install the plugin:

```sh
npm install gatsby-plugin-fathom
```

- Add it to your Gatsby config:

gatsby-config.js

```js
module.exports = {
  plugins: [
    {
      resolve: `gatsby-plugin-fathom`,
      options: {
        siteId: "Your-Site-ID",
      },
    },
  ],
};
```

Be sure to set your Site ID as a plugin option.
If you start your local development server, you won't notice a difference. The plugin is disabled in development, which ensures that you don't accidentally track yourself while working away on your site. If you want to test it out, you can always run gatsby build && gatsby serve.
This is all you need to do to get Fathom tracking working. You should start to see page views being collected.
Database
I use a database to track how many reactions each of my articles has received, as well as for polls. In fact, part of this case study covers how I added page reactions, which you can read if you want the implementation in more detail.
Firebase Implementation Overview
In previous iterations I bundled Firebase in with Gatsby to use it within my front end, but I have since moved all Firebase interaction into Gatsby Functions that can be called from anywhere. This makes my bundle smaller and easier to manage.
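As a rough illustration of that setup, a Gatsby Function wrapping Firestore might look something like the sketch below. The file name, collection name and environment variable are illustrative rather than my exact implementation:

```js
// src/api/reactions.js - a minimal sketch, assuming firebase-admin and a
// service account stored in an environment variable (names are hypothetical)
import admin from "firebase-admin";

// Initialise the admin SDK once per function instance
if (!admin.apps.length) {
  admin.initializeApp({
    credential: admin.credential.cert(
      JSON.parse(process.env.FIREBASE_SERVICE_ACCOUNT)
    ),
  });
}

export default async function handler(req, res) {
  // e.g. /api/reactions?articleId=my-first-post
  const { articleId } = req.query;
  const doc = await admin
    .firestore()
    .collection("reactions")
    .doc(articleId)
    .get();
  res.json(doc.exists ? doc.data() : {});
}
```

The front end can then hit the function with a plain fetch, so no Firebase code ends up in the client bundle.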
If you would prefer to use Firebase from within Gatsby itself, you can check out gatsby-plugin-firebase. There's also a great article by Kyle Shevlin that provides a very solid introduction to using it. I used that in combination with react-firebase-hooks to read from the database.
Real-time Connections
I use these connections to present without sharing my screen, but it also works as a great way to retrieve the number of active users on the site. Socket.IO is my go-to solution for this kind of thing.
Implementing Socket.IO
To use sockets in the project, we need to create a server to communicate between connected clients. You can get started quickly by throwing the following on a Node server (probably best to keep this separate from your Gatsby project):
server.js

```js
const express = require("express");

const PORT = process.env.PORT || 3000;
const server = express().listen(PORT, () =>
  console.log(`Listening on ${PORT}`)
);
const io = require("socket.io")(server);

io.on("connection", function (socket) {
  console.log("socket connected: " + socket.id);
  io.emit("userCount", {
    data: io.engine.clientsCount,
  });
  socket.on("disconnect", function () {
    io.emit("userCount", {
      data: io.engine.clientsCount,
    });
  });
});
```
I use React Context to retrieve data from the socket connection. This is a great way to pass data between components. The magic really all happens in these two useEffects:
```js
useEffect(() => {
  setSocket(io("ws://localhost:3000"));
}, []);

useEffect(() => {
  if (socket) {
    socket.on("connect", () => {
      setConnected(true);
    });
    socket.on("disconnect", () => {
      setConnected(false);
    });
    socket.on("userCount", ({ data: count }) => {
      setCount(count);
    });
  }
}, [socket]);
```
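For context, those effects sit inside a provider component. Here is a minimal sketch of what that could look like; the names SocketProvider and useSocketStats are my own for this example:

```js
import React, { createContext, useContext, useEffect, useState } from "react";
import io from "socket.io-client";

// Hypothetical context holding the connection status and active user count
const SocketContext = createContext({ connected: false, count: 0 });

export const SocketProvider = ({ children }) => {
  const [socket, setSocket] = useState(null);
  const [connected, setConnected] = useState(false);
  const [count, setCount] = useState(0);

  // The two effects from above live here
  useEffect(() => {
    setSocket(io("ws://localhost:3000"));
  }, []);

  useEffect(() => {
    if (socket) {
      socket.on("connect", () => setConnected(true));
      socket.on("disconnect", () => setConnected(false));
      socket.on("userCount", ({ data }) => setCount(data));
    }
  }, [socket]);

  return (
    <SocketContext.Provider value={{ connected, count }}>
      {children}
    </SocketContext.Provider>
  );
};

export const useSocketStats = () => useContext(SocketContext);
```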
I usually wrap the root element in the context so the data is available everywhere. In this case, the wrapper wraps our site content in a Redux provider:
src/state/wrapWithProvider.js
```js
import React from "react";
import { Provider } from "react-redux";
import createStore from "./createStore";

export default ({ element }) => {
  return <Provider store={createStore}>{element}</Provider>;
};
```
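Gatsby picks this wrapper up through its wrapRootElement API. Assuming the file lives at the path above, the wiring in gatsby-browser.js looks something like this:

```js
// gatsby-browser.js (repeat the same export in gatsby-ssr.js)
import wrapWithProvider from "./src/state/wrapWithProvider";

export const wrapRootElement = wrapWithProvider;
```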
In our store, we add a reducer to handle any incoming actions from the server.
src/state/createStore.js
```js
import { createStore, applyMiddleware } from "redux";
import createSocketIoMiddleware from "redux-socket.io";
import io from "socket.io-client";

let socket = io("http://your-socket-server-here.com");
let socketIoMiddleware = createSocketIoMiddleware(socket, "server/");

function reducer(
  state = {
    count: 0,
  },
  action
) {
  switch (action.type) {
    case "userCount":
      return Object.assign({}, { ...state, count: action.data });
    default:
      return state;
  }
}

let store = applyMiddleware(socketIoMiddleware)(createStore)(reducer);
store.subscribe(() => {
  console.log("new client state", store.getState());
});

export default store;
```
Now if we connect any component using Redux we will have access to these values:
```js
import React from "react";
import { connect } from "react-redux";

const Stats = ({ count }) => {
  return (
    <div>
      <h1>There are {count} users online!</h1>
    </div>
  );
};

const mapStateToProps = ({ count }) => {
  return { count };
};

const ConnectedStats =
  typeof window !== `undefined` ? connect(mapStateToProps, null)(Stats) : Stats;

export default ConnectedStats;
```
Code Stats
To collect code statistics, I use cloc. Cloc counts blank lines, comment lines, and physical lines of source code in many programming languages, including JS! I am always impressed by how quickly it runs. I have created a custom script in my package.json that runs the tool:
```json
{
  "count_totals": "cloc --exclude-ext=svg --exclude-dir=animation-data --json src/ MDX/ > data/stats/count_total.json"
}
```
I've included quite a few flags in the command, so let's run through them:

- `--exclude-ext=svg` ignores files with the .svg extension.
- `--exclude-dir=animation-data` ignores the folder that contains my animation data.
- `--json` outputs the results as JSON.
- `src/ MDX/` looks only at code in my src and MDX directories.
- `> data/stats/count_total.json` writes the output of the run to a file at the specified path.
I added gatsby-transformer-json to ingest this data and sourced the file in my gatsby-config.js:
```js
module.exports = {
  plugins: [
    `gatsby-transformer-json`,
    {
      resolve: `gatsby-source-filesystem`,
      options: {
        name: `stats`,
        path: `${__dirname}/data/stats/count_total.json`,
      },
    },
  ],
};
```
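With the transformer in place, the totals become available in the GraphQL layer. The node name is inferred from the file name, so I would expect something like countTotalJson, but it is worth confirming in GraphiQL; here's a sketch of a query under that assumption:

```js
import { graphql, useStaticQuery } from "gatsby";

// Hypothetical hook pulling the cloc totals; `countTotalJson` and the SUM
// fields assume the node name inferred from count_total.json and cloc's
// standard JSON output.
const useCodeStats = () => {
  const { countTotalJson } = useStaticQuery(graphql`
    {
      countTotalJson {
        SUM {
          code
          comment
          blank
          nFiles
        }
      }
    }
  `);
  return countTotalJson.SUM;
};

export default useCodeStats;
```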
External Sources
I also use data that is collected for me by other services connected to this site, most notably GitHub. These require no setup within the code.
Using Collected Data
Source Plugins At Build
I have created a couple of local plugins that I use to source data at build time. This data is then available in my GraphQL layer to use on the page. It's important to note that data pulled into the site in this way is not updated in real time; it only updates when the site is built and deployed. To stop myself needing to trigger a build manually every time, I have a cron job set up on a Node server that pings Gatsby Cloud at 9PM GMT (the six-field pattern 0 00 21 * * * below fires at 21:00:00 every day):
```js
const CronJob = require("cron").CronJob;
// On older Node versions fetch is not global, so pull it in explicitly
const fetch = require("node-fetch");

const GatsbyWebHook = "your-gatsby-cloud-webhook-url";

var job = new CronJob(
  "0 00 21 * * *",
  function () {
    fetch(GatsbyWebHook, { method: "POST", body: "a=1" }).then(() =>
      console.log("Pinged Gatsby")
    );
  },
  null,
  true,
  "Europe/London"
);
```
There's a great cheat sheet for cron that I would totally recommend using if you're setting this up.
GitHub
GitHub has an awesome API you can use to grab your GitHub stats and repos. The most recent version is queried via GraphQL, which should be familiar to Gatsby users. Convenient! You can discover the data you can retrieve using the [API explorer](https://developer.github.com/v4/explorer/).
In my gatsby-source-github-profile plugin, I fetch repository information via this API and add it to my GraphQL layer using the createNode action. An example fetching just the forks and stars of a repo can be found below:
```js
const fetch = require("node-fetch");
const crypto = require("crypto");

exports.sourceNodes = async ({ actions }, configOptions) => {
  const { createNode } = actions;
  const headers = { Authorization: `bearer ${configOptions.token}` };
  const body = {
    query: `query {
      user(login: "${configOptions.username}") {
        name
        repository(name: "personal-site") {
          id
          createdAt
          url
          forkCount
          stargazers {
            totalCount
          }
        }
      }
    }`,
  };
  const response = await fetch("https://api.github.com/graphql", {
    method: "POST",
    body: JSON.stringify(body),
    headers: headers,
  });
  const data = await response.json();
  const { forkCount } = data.data.user.repository;
  const stars = data.data.user.repository.stargazers.totalCount;
  createNode({
    forks: Number(forkCount),
    stars: Number(stars),
    id: "Github-Profile",
    internal: {
      type: `GitHubProfile`,
      mediaType: `text/plain`,
      contentDigest: crypto
        .createHash(`md5`)
        .update(JSON.stringify({ stars, forkCount }))
        .digest(`hex`),
      description: `Github Profile Information`,
    },
  });
};
```
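For context, the plugin is registered like any other in gatsby-config.js. The option names below match what the plugin reads (configOptions.username and configOptions.token), though the values themselves are placeholders:

```js
// gatsby-config.js
module.exports = {
  plugins: [
    {
      resolve: `gatsby-source-github-profile`,
      options: {
        username: "your-github-username",
        // a personal access token, kept in an environment variable
        token: process.env.GITHUB_TOKEN,
      },
    },
  ],
};
```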
Google Analytics
I take a similar approach with Google Analytics, but instead of using REST I use the googleapis npm package. In my implementation I am using V3 of the Google Analytics API, even though V4 has been out for a while; I am used to V3 and feel more comfortable with it, but will update this code soon. I make multiple queries, but as an example, here is my retrieval and ingestion of my site-wide views to date:
```js
const crypto = require("crypto");
const { google } = require("googleapis");

const SiteWideStats = await google.analytics("v3").data.ga.get({
  auth: jwt,
  ids: "ga:" + viewId,
  "start-date": startDate || "2009-01-01",
  "end-date": "today",
  metrics: "ga:pageviews, ga:sessions",
});

for (let [pageViews, sessions] of SiteWideStats.data.rows) {
  createNode({
    pageViews: Number(pageViews),
    sessions: Number(sessions),
    id: "All-site",
    internal: {
      type: `SiteWideStats`,
      contentDigest: crypto
        .createHash(`md5`)
        .update(JSON.stringify({ pageViews, sessions }))
        .digest(`hex`),
      mediaType: `text/plain`,
      description: `Page views & sessions for the site`,
    },
  });
}
```
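The snippet above assumes a jwt auth client and a viewId already exist inside the plugin's async sourceNodes function. Setting those up with a Google service account looks roughly like this; the environment variable names are placeholders:

```js
const { google } = require("googleapis");

// A service account with read access to the Analytics view; private keys
// stored in env vars usually need their escaped newlines restoring.
const jwt = new google.auth.JWT(
  process.env.GA_CLIENT_EMAIL,
  null,
  process.env.GA_PRIVATE_KEY.replace(/\\n/g, "\n"),
  ["https://www.googleapis.com/auth/analytics.readonly"]
);
await jwt.authorize();

const viewId = process.env.GA_VIEW_ID;
```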
Accessing Firebase Firestore Data
Retrieving data from the store is easy with the previously mentioned react-firebase-hooks. I do just want to note that I use the "Once" variants of the hooks (e.g. useDocumentOnce as opposed to useDocument). This means the data is read once when the page loads, instead of continually. This reduced my database usage from thousands of reads to hundreds of reads and therefore saves me money. The only downside is that the reactions may be out of sync if someone else reacts while you are still viewing the page.
Using this implementation, I can retrieve the reactions with the following code:
```js
const [value, loading, error] = useCollectionOnce(
  firebase.firestore().collection("likes")
);
```
What's cool about this hook is that it gives us a loading boolean to indicate whether the data is still being loaded, and any errors can be retrieved too. If both loading and error are falsy, we know we have retrieved our data and can use it in our components:
```js
if (!loading && !error) {
  console.log(value);
}
```
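In practice, value here is a Firestore QuerySnapshot, so once loading and error are clear the documents can be mapped straight into the UI. The likes field name below is an assumption about my schema:

```js
// Each document id is an article slug in this sketch; `likes` is assumed.
if (!loading && !error) {
  return value.docs.map((doc) => (
    <span key={doc.id}>
      {doc.id}: {doc.data().likes} reactions
    </span>
  ));
}
```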
Stats Page Design
Inspired by the gov.uk website, I wanted to make my stats human readable. To do this I opted to incorporate most of the stats into sentences.
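As a simplified example of what I mean, a stat rendered as a sentence rather than a bare figure might look like this; the component and prop names are made up for illustration:

```js
import React from "react";

// Numbers come from the GraphQL data sourced at build time
const ViewsSentence = ({ pageViews, sessions }) => (
  <p>
    This site has been viewed {pageViews.toLocaleString()} times across{" "}
    {sessions.toLocaleString()} sessions.
  </p>
);

export default ViewsSentence;
```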
I also created a graph for views by day. I initially implemented it using the recharts library but found it to be a little heavyweight for my use case. I instead decided to just create it with divs:
```js
allViewsPerDate.map((item) => {
  return (
    <div
      className="is-white-bg border-radius margin-1-r"
      style={{
        position: "relative",
        width: 100 / allViewsPerDate.length + "%",
      }}
      data-tip={`${item.node.views} views`}
    >
      <div
        className="is-pink-bg-always border-radius margin-1-r"
        style={{
          position: "absolute",
          bottom: 0,
          width: "100%",
          height: Math.floor((item.node.views / maxViews) * 100),
        }}
      />
    </div>
  );
});
```
You'll see here that I am just placing one div on top of the other to create the bars, but I think it is quite effective.
Implementation
Code for the page can be found in my personal site's repo.
Footer Stats (Bonus)
Looking at my retrieved Google Analytics data in my GraphQL layer, it occurred to me that I had everything I needed to put the stats each individual page had received in its footer. I would need a fallback though: if I add a new page, it won't have any stats until the following build. In such instances I thought I would display site-wide stats.
In my footer I use a StaticQuery to retrieve this information from my GraphQL layer:
```js
<StaticQuery
  query={graphql`
    {
      allPageViews {
        edges {
          node {
            totalCount
            path
            sessions
          }
        }
      }
      siteWideStats {
        sessions
        pageViews
      }
    }
  `}
  render={(data) => {
    const pageViews = data.allPageViews.edges.find(
      (item) => item.node.path === location.pathname
    );
    if (pageViews) {
      let node = pageViews.node;
      return (
        <p className="footer-sub-text margin-0 margin-1-t margin-2-b">
          {node.totalCount} page views | {node.sessions} sessions |{" "}
          <Link to="/stats" className="is-special-blue">
            All Stats
          </Link>{" "}
          -{" "}
          <Link to="/disclaimer" className="is-special-blue">
            Disclaimer
          </Link>
        </p>
      );
    } else {
      // render site wide stats
    }
  }}
/>
```
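The fallback branch is only a comment in the snippet above; a sketch of what it could render, reusing the site-wide numbers from the same query, might be:

```js
} else {
  // Fall back to the site-wide totals pulled down by the same query
  return (
    <p className="footer-sub-text margin-0 margin-1-t margin-2-b">
      {data.siteWideStats.pageViews} page views |{" "}
      {data.siteWideStats.sessions} sessions |{" "}
      <Link to="/stats" className="is-special-blue">
        All Stats
      </Link>
    </p>
  );
}
```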
The Result
You can check out the results using the magical teleportation device (buttons) below:
If you enjoyed this deep dive why not show me by using the article reactions? You can also sign up to my newsletter using the form below this article - no pressure though!