I Built Coronavirus Live Monitor - stats, news, and WHO press releases on the virus all in one place

  • Gabriel Romualdo

  • March 21, 2020

Today, I am proud to release my latest web app and project: Coronavirus (COVID-19) Live Monitor, your hub for news and information on the Coronavirus outbreak. The code can be found on the GitHub repo.

Go check it out!

Site Demo

The site includes three main sections: stats, WHO press releases and health resources, and latest news.

The stats section includes a total case count, as well as total deaths and recoveries and their percentages within the total case count.

The WHO press releases and health resources section includes a video of the latest WHO press conference about the virus and links to latest situation reports and WHO news, as well as various links to health resources such as the NHS (U.K.) and the CDC (U.S.A.).

Finally, the latest news section includes news articles about the virus from a vast array of news sources. News on the virus is also sorted by country based on the user's IP address, allowing users to read news in their own language, localized for their area.

How It Was Built (in around 2-3 days total)

If you're interested in reading specifically how this was built, I've written a detailed description of the steps I took to go from zero lines of code to a working web app over the course of a few days and a handful of commits to GitHub.

If you choose not to continue reading, I hope you take a look at the site if you have a chance.

If you do continue reading, I hope you find my development process interesting and check out the code on GitHub.

1. Ideas and Wireframing

Before I wrote a line of code for this project, I made sure to know exactly what I was trying to build, and which technologies I would be using.

One thing I kept in mind while working on this is the large amount of information around this epidemic. At the time of writing, it can seem almost impossible to escape headlines about the virus, whether scrolling through YouTube, Instagram, or Twitter, or even reading COVID-19-related posts on The DEV Community. This constant flow of information has in some cases caused a lot more panic than helpful action.

So, bearing this in mind, I decided it was best not to create my own coronavirus informational content, but instead simply to link to material from health experts as resources, as I am not an expert myself.

My main goal with this project was simply to get something working, fast.

I drew out three sections for a basic site: one large section with a case count, another with the latest news, and a larger one in the center linking to the latest W.H.O. press releases and other health resources such as the CDC (U.S.A.) and the NHS (U.K.).

I made the decision to write the site entirely in plain HTML, CSS, and JavaScript, simply because my goal was fast development speed, and I am personally most proficient and confident with plain HTML, CSS, and vanilla JS. One reason I didn't choose a framework is that this is a smaller project, and the maintainability and clarity benefits frameworks bring would have had less of a positive effect in my case.

2. Setting Up the HTML and CSS

With this project, I went for a component-based CSS structure. Styles were reused in most places in my CSS, allowing for quicker development.

I set up the basic HTML and CSS with the three sections I had chosen (stats, information, and latest news), and made sure to make each section a component.

Here's what that might have looked like:

<div class="col">
    <div class="panel"><!-- Code here... --></div>
</div>
<div class="col">
    <div class="panel"><!-- Code here... --></div>
</div>

In this example, the panel class was a component, styled with rounded corners, some padding, and a background color.

The site had numerous reused components, such as a panel header, a tabs system, and several more.

3. Building the News Panel

I have some experience building news aggregators in the past, for example my news site Kalva, which organizes articles from hundreds of news sources into one place, searchable and sortable by source and country.

For that site, I used NewsAPI to get articles from various sources and sort and search them. I used NewsAPI for the news panel of this site as well.

I wrote a basic JavaScript fetch script to get latest articles and headlines with the keyword "coronavirus" in them, and then display those articles in the news panel.
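That fetch script might have looked something like this sketch. The element ID, the rendering markup, and the API key are placeholders of my own, not necessarily what the repo actually uses:

```javascript
// Placeholder key; a real NewsAPI key would be needed here.
const NEWS_API_KEY = "YOUR_API_KEY";

// Build the NewsAPI "everything" endpoint URL for a given keyword
function buildNewsURL(keyword) {
  return (
    "https://newsapi.org/v2/everything?q=" +
    encodeURIComponent(keyword) +
    "&sortBy=publishedAt&apiKey=" +
    NEWS_API_KEY
  );
}

// Render one article object ({ title, url, source }) as an HTML string
function renderArticle(article) {
  return (
    '<a class="article" href="' + article.url + '">' +
    "<h3>" + article.title + "</h3>" +
    "<p>" + article.source.name + "</p>" +
    "</a>"
  );
}

// Fetch the latest "coronavirus" articles and display them in the news panel
// ("#news-panel" is an assumed element ID)
function loadNews() {
  fetch(buildNewsURL("coronavirus"))
    .then((response) => response.json())
    .then((data) => {
      document.querySelector("#news-panel").innerHTML =
        data.articles.map(renderArticle).join("");
    });
}
```

Splitting the URL building and rendering into pure functions keeps the network call itself trivial and the pieces easy to test.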

NewsAPI also has a feature that allows articles to be sorted by country, which I implemented in a select box in the news panel.

To display news from the user's current country, I used IPAPI to determine the user's country based on their IP address, and then displayed news from only that country.
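A minimal sketch of that lookup might look like the following; the exact IPAPI endpoint and response fields are assumptions, and the fallback country is my own choice for illustration:

```javascript
// Normalize a geolocation response to a two-letter country code,
// falling back to a default if the lookup fails or returns junk
function countryFromGeo(geo, fallback = "us") {
  if (geo && typeof geo.country_code === "string" && geo.country_code.length === 2) {
    return geo.country_code.toLowerCase();
  }
  return fallback;
}

// Look up the visitor's country from their IP, then load news for it
// (endpoint and query parameters are assumptions for this sketch)
function loadLocalizedNews() {
  fetch("https://ipapi.co/json/")
    .then((response) => response.json())
    .then((geo) => {
      const country = countryFromGeo(geo);
      return fetch(
        "https://newsapi.org/v2/top-headlines?country=" +
          country +
          "&q=coronavirus&apiKey=YOUR_API_KEY"
      );
    })
    .then((response) => response.json())
    .then((data) => {
      // render data.articles as in the news panel above
    });
}
```

The defensive fallback matters here: if the geolocation request fails or is blocked, the user still sees news from a default country rather than an empty panel.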

4. Building the Cases and Stats Panel

The first thing I needed for the cases and stats panel was actual statistics. At the time I wrote the code, I didn't have access to APIs like The COVID Tracking Project, so I chose to use a basic web scraping and cronjob system.

The general idea was to scrape a site with stats every couple of minutes, and then update my own site with those stats.

I wrote a web scraper in PHP which, upon request, updated a JavaScript file with the current stats stored in a few global variables.

That PHP web scraper was then requested by a Python program running locally every few minutes.

You can see specifically how I did this in the code on GitHub.

5. Building the Latest Updates and Health Resources Panel

The last panel included two tabs, one with latest updates and a press video from the W.H.O., and another with links to health resources.

For fetching the latest W.H.O. press video, I first attempted a system similar to my PHP web scraper for the current case data.

My PHP web scraper used a basic HTTP request to fetch HTML. Unfortunately, the W.H.O. site uses client-side JavaScript to update its press videos, and a basic HTTP request in PHP cannot run those client-side scripts.

This type of problem is becoming very common, as client-side JavaScript is increasingly used for significant DOM updates, particularly with frameworks like React.

To fix this, I instead used a scraping program written in JavaScript and Node.js with a handy web scraping tool and NPM package called Puppeteer, which allows an (invisible) Chromium window to be opened at a specified URL, and the page to be scraped after its client-side scripts have run.

I connected my Puppeteer-based JavaScript code to a basic Express.js server, which was then requested by a Python cronjob system that scraped the site periodically and wrote the press video URL to a JavaScript file for the frontend code to read.
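The Puppeteer side of that might be sketched like this. The page URL, the CSS selector, and the generated variable name are placeholders, not the W.H.O. site's real markup or the repo's actual names:

```javascript
// Serialize the scraped video URL into a JS file of one global variable
// for the frontend to read (variable name is an assumption)
function videoUrlToJs(url) {
  return "var whoPressVideoURL = " + JSON.stringify(url) + ";";
}

// Open a headless Chromium window, let the page's client-side
// scripts run, then pull the embedded video URL out of the DOM
async function scrapePressVideo(pageUrl) {
  const puppeteer = require("puppeteer"); // npm install puppeteer
  const browser = await puppeteer.launch({ headless: true });
  const page = await browser.newPage();
  // networkidle0 waits until the page's network activity has settled,
  // i.e. after client-side scripts have updated the DOM
  await page.goto(pageUrl, { waitUntil: "networkidle0" });
  const videoUrl = await page.evaluate(() => {
    const iframe = document.querySelector("iframe"); // placeholder selector
    return iframe ? iframe.src : null;
  });
  await browser.close();
  return videoUrl;
}
```

This is exactly the capability a plain HTTP fetch lacks: `page.evaluate` runs in the rendered page, so it sees the DOM *after* the site's scripts have inserted the video.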

6. Making it Work on Mobile

After everything was done, I spent some time writing the CSS to make everything work smoothly on mobile.

On mobile, I split the entire site into three tabs: news, information, and stats. Tab functionality was written with JavaScript DOM updates, and to make sure the current tab remained the same when reloading, I used a URL hash.

The URL hash looks something like this:

https://covid19.xtrp.io/ --> default tab
https://covid19.xtrp.io/#cases --> cases and stats tab
https://covid19.xtrp.io/#news --> news tab

One key advantage of using a URL hash is that, when the hash is updated or added, the page is not forced to reload. This allows for a seamless user experience while also ensuring the page is correctly displayed on load.

A hash is easily set via the location.hash JavaScript global.
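A minimal sketch of that tab system, assuming hypothetical tab names and element IDs (only `#cases` and `#news` appear in the URLs above; the rest are my own placeholders):

```javascript
const TABS = ["news", "info", "cases"];
const DEFAULT_TAB = "news"; // assumed default for this sketch

// Map a location.hash value (e.g. "#cases") to a valid tab name,
// falling back to the default for "" or unknown hashes
function tabFromHash(hash) {
  const name = (hash || "").replace(/^#/, "");
  return TABS.includes(name) ? name : DEFAULT_TAB;
}

// Show the chosen tab's panel, hide the others, and record the
// choice in the URL hash (updating a hash never reloads the page)
function switchTab(name) {
  TABS.forEach((tab) => {
    document.querySelector("#" + tab + "-tab").hidden = tab !== name;
  });
  location.hash = name;
}

// On load, restore whichever tab the hash points at, e.g.:
// window.addEventListener("load", () => switchTab(tabFromHash(location.hash)));
```

Because `tabFromHash` validates against a whitelist, a mistyped or stale hash degrades gracefully to the default tab instead of showing a blank page.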

That's All For Now!

If you liked this post, check out the site and star it on GitHub!

I'll be updating my site at xtrp.io a bit more regularly in the near future, as well as releasing new projects, so be sure to check out some of my other posts if you liked this one.

Thanks for scrolling.

— Gabriel Romualdo, March 21, 2020