How to live demo a web app with lousy internet

The Squid HTTP proxy lets you run a software demonstration no matter how good or bad your internet connection is.

Live demos are the bane of professional speakers everywhere. Even the most well-prepared live demo can go wrong for unforeseeable reasons, and that's a bad thing to have happen while you're on stage in front of 300 people. Live demos of remote web apps are so fraught with peril that most people find other ways of presenting them: screenshots never fail, and local sandboxes won't falter on an overloaded conference internet connection. But what if we can't set up a local sandbox in time for our talk? What if our database is huge and complex? What if our app has animation and interactions that we can't show with screenshots?

What if you could record your web application's use and then replay the stored responses at the right time? Luckily, it's easy to proxy HTTP, the protocol that web browsers and web servers use to communicate with each other. This means you can put an intermediary between your browser and the server to do whatever you want. Proxies are often used for content filtering (e.g., corporate filters, parental filters), and they can cache data on a server closer to the user to speed up a website.

In this tutorial, I'm going to use a web proxy in a similar way: I'll cache my content and serve that cached data to my web browser. However, I'm going to run my proxy on the same machine as my web browser. And, I'm going to set it up to cache only the things that I want. This way I can run a live demo on an unstable connection.

Install and configure Squid HTTP proxy

First, I need to install and configure the proxy. I'm on a Mac, so I installed the Squid HTTP proxy via Homebrew, a free package manager for macOS. After the proxy is installed, I need to configure it.
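
If you don't already have Squid, installing it with Homebrew is a single command. On my machine, Homebrew places the configuration file at /usr/local/etc/squid.conf, though your install prefix may differ:

brew install squid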

For my live demo, I want to cache the application I'm demoing and any other content the application needs; caching anything else is unnecessary. To do this, Squid uses access control lists (ACLs). I'll configure an ACL with the list of domains I want to cache and deny caching for everything else. For maximum coverage, I'll add both the host names and the IP address to the ACL. Because the browser hands the whole URL to the proxy, it's usually the proxy that does the DNS lookup, but sometimes a browser already knows the IP and will simply tell the proxy to fetch from that IP.

Here's my list of domains and IPs:

acl cacheDomain dstdomain beta.cpantesters.org
acl cacheDomain dstdomain api.cpantesters.org
acl cacheDomain dstdomain www.cpantesters.org
acl cacheDomain dstdomain 212.110.173.51
acl cacheDomain dstdomain cdnjs.cloudflare.com

The first three domains are the applications I'm demoing. The fourth is the IP address of my application server; all the domains are hosted on the same machine. The last is CDNJS, the content delivery network (CDN) that serves my JavaScript. For my application to work, I need to cache all the JavaScript and CSS I depend on from CDNJS.

Once I've listed what I want to cache, I can forbid any other domains from being cached:

cache deny !cacheDomain

Next, I need to tell Squid where to put my cache and how much disk space to use. Homebrew's Squid configuration has a cache_dir line commented out. I need to enable it and increase the available disk space to ensure my data stays cached. When the disk space is used up, Squid starts deleting old cached data, which I can't have during my demo.

# Uncomment and adjust the following to add a disk cache directory.
cache_dir ufs /usr/local/var/cache/squid 1024 16 256

The first number after the cache directory path is the cache size in MB, which I adjusted to 1024 (1GB). (The 16 and 256 that follow control how many first- and second-level subdirectories Squid creates inside the cache directory.)

Finally, I must make sure that I can use Squid's management API, and that it's open only to the local machine. This should be the default, so I look for these http_access lines and add them if they don't exist.

# Only allow cachemgr access from localhost
http_access allow localhost manager
http_access deny manager

After allowing cache manager access from localhost, I need to disable the cache manager password:

cachemgr_passwd none all
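
Put together, all of my additions to squid.conf look like this:

acl cacheDomain dstdomain beta.cpantesters.org
acl cacheDomain dstdomain api.cpantesters.org
acl cacheDomain dstdomain www.cpantesters.org
acl cacheDomain dstdomain 212.110.173.51
acl cacheDomain dstdomain cdnjs.cloudflare.com
cache deny !cacheDomain
cache_dir ufs /usr/local/var/cache/squid 1024 16 256
http_access allow localhost manager
http_access deny manager
cachemgr_passwd none all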

I'm finished with the configuration file, so now I can start the proxy. With Homebrew, the command is brew services start squid, but your platform may differ. Once started, the proxy is running and waiting for requests.
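
Before touching any browser settings, I can confirm the proxy is answering by sending a request through it with curl. This assumes Squid is listening on its default port, 3128:

curl -x http://localhost:3128 -I http://beta.cpantesters.org/

A working proxy adds a Via header naming the Squid instance to the response. Next, I need to configure my browser to use the proxy.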

Configure the web browser

Configuring a web browser to use an HTTP proxy depends on which browser and OS you use. If you're using Chrome or Safari on macOS, you can configure a proxy in System Preferences. If you're using Firefox, however, you can configure the browser to use a proxy and leave the rest of the system alone. Other operating systems have their own ways of configuring proxies, so check your OS's documentation.

There are some good browser plugins for managing HTTP proxies. If you're using Chrome, try Proxy SwitchyOmega, and for Firefox use FoxyProxy Standard. Unfortunately there aren't good proxy plugin options for Safari or Internet Explorer.

Run through the demo to cache the content

Once I've configured my proxy, I can run through my demo to test it. I need to do this on a good internet connection. As I run through the demo, my browser asks the proxy to fetch all the demo's data, and the proxy caches it on disk as it goes. Since my computer is online, Squid follows the caching rules the web server sets: caching each response for a specific length of time and possibly revalidating the data to see if it has changed.

As I run through my demo, I should make sure that my cache is being used. The easiest way to do that is to read Squid's log. In my configuration, it's located at /usr/local/var/logs/access.log. Inside are lines that look like this:

1498020228.970    203 ::1 TCP_MISS/200 3653 GET http://beta.cpantesters.org/chart.html? - HIER_DIRECT/212.110.173.51 text/html
1498020229.523    314 ::1 TCP_REFRESH_MODIFIED/200 8130 GET http://api.cpantesters.org/v3/release/dist/Statocles - HIER_DIRECT/212.110.173.51 application/json

The important parts of each line are the URL and the status. TCP_MISS/200 means "this request was not in our cache, and the remote server returned a 200 OK HTTP response." TCP_REFRESH_MODIFIED/200 means "this request was in our cache, but we refreshed it from the remote server, which returned a 200 OK HTTP response." This is my cache building and refreshing itself because I'm on a stable connection. Once I have some data in my cache, I'll start seeing things like this:

1498063273.261      0 ::1 TCP_INM_HIT/304 299 GET http://beta.cpantesters.org/chart.html - HIER_NONE/- text/html
1498063281.831      0 ::1 TCP_MEM_HIT/200 8187 GET http://api.cpantesters.org/v3/release/dist/Statocles - HIER_NONE/- application/json

TCP_INM_HIT/304 means "the browser sent a conditional request, and the cache itself answered with a 304 Not Modified response." TCP_MEM_HIT/200 means "the cache served this response from memory with a 200 OK HTTP response, without contacting the remote server." The HIER_NONE/- in these lines confirms that no upstream server was consulted. These are the responses I want: the cache is serving responses, not the remote server.
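
To watch hits and misses as they happen while rehearsing, I can follow the log with tail, using the log path from my configuration above:

tail -f /usr/local/var/logs/access.log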

Run the demo

Now that my cache is operating well on a stable connection, I can run my demo on an unstable one. First, I want to make sure that my cache does not try to access the remote server (i.e., it's operating in Squid's "offline" mode). To do this, Squid has a management client called squidclient that I can use to toggle offline mode.

$ squidclient mgr:offline_toggle
HTTP/1.1 200 OK
Server: squid/3.5.26
Mime-Version: 1.0
Date: Tue, 04 Jul 2017 21:16:36 GMT
Content-Type: text/plain;charset=utf-8
Expires: Tue, 04 Jul 2017 21:16:36 GMT
Last-Modified: Tue, 04 Jul 2017 21:16:36 GMT
X-Cache: MISS from gwen.local
Via: 1.1 gwen.local (squid/3.5.26)
Connection: close

offline_mode is now ON

Squid's offline mode minimizes attempts to fetch remote content. Since I cached all my content by running through my demo, Squid can now serve the entire demo from its cache.
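
Because offline_toggle is exactly that, a toggle, running the same command again after the demo should report "offline_mode is now OFF" and return Squid to its normal fetching and revalidating behavior:

squidclient mgr:offline_toggle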

So now I can run my demo anywhere without worry! All the remote content is served by the local machine, so it doesn't matter how good or bad the conference WiFi is. As long as I stick to things I've already cached, my web application runs perfectly.
