Software has been eating the world for far longer than four years. But developers think of software a little differently. Insofar as we're solving real-world problems, we're not thinking mainly of the metal. And as the problems get bigger and the solutions more complex, pragmatic (and not-too-leaky) abstractions become more important than ever.
The upshot: From a productivity-oriented developer's point of view, frameworks are eating the world. But which ones are eating how much of which parts of the world?
Given the insane variety of superb open source frameworks available, I picked our top 5 open source frameworks of 2015 not from a single ranked order, but from all levels of the stack. (For front-ends, I focused on the web and, still more narrowly, true client-side frameworks—simply because browsers and mobile devices are growing increasingly capable, and because SPAs [single page applications] and the like avoid sending data over the wire unnecessarily.)
1. Presentation: Bootstrap
Let's start at the top of the stack: the presentation layer, the stuff that developers and users both touch. Here the clear winner remains Bootstrap, and the forecast looks outstanding, leaving old alternatives, such as Foundation, and new kids, such as Material Design Lite, in the dust. Bootstrap dominates usage trends on BuiltWith, and on GitHub it remains easily the most starred and most forked framework of all time.
2. Web MVC: AngularJS
As the web platform continues to mature, developers enjoy increasingly well-crafted abstraction-distance from the still-markup-colored DOM. The work begun by XMLHttpRequest reaches its zenith in modern Single-Page Applications (SPAs), and the most popular SPA framework by far is AngularJS.
What's so special about AngularJS? In a word: directives. A directive lets you extend HTML itself, so a little custom markup (an attribute or an element) can pull rich, reusable behavior into the page declaratively.
Perhaps a bit sadly, the most aggressive concept behind AngularJS, two-way data binding (which effortlessly keeps views and models in sync), is going away in Angular 2, which is "very close" to a beta release. So a bit of the magic will disappear, but so will some massive performance headaches at scale and some hair-pullingly tough debugging (just think about two-way propagation for a moment and feel the cliff drop out from under your feet): a trade-off that grows more valuable as page size and SPA complexity balloon.
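Two-way binding is easy to sketch in the abstract. The toy below is a conceptual sketch, not Angular's actual $digest machinery: a write on either side is propagated to the other, and a re-entrancy guard is needed to keep the update from echoing back and forth forever; exactly the kind of bookkeeping that gets expensive when thousands of bindings are live.

```python
# Conceptual sketch of two-way data binding (NOT AngularJS's real
# implementation): a model and a view that mirror each other's writes.

class TwoWayBinding:
    def __init__(self):
        self.model = {}
        self.view = {}
        self._syncing = False  # guard against infinite echo

    def set_model(self, key, value):
        self.model[key] = value
        self._sync(self.set_view, key, value)

    def set_view(self, key, value):
        self.view[key] = value
        self._sync(self.set_model, key, value)

    def _sync(self, other_setter, key, value):
        if self._syncing:
            return             # this write came from the other side; stop here
        self._syncing = True
        try:
            other_setter(key, value)  # re-enters, but the guard cuts the loop
        finally:
            self._syncing = False

binding = TwoWayBinding()
binding.set_model("name", "Ada")   # model write shows up in the view
binding.set_view("name", "Grace")  # view write flows back to the model
print(binding.model["name"], binding.view["name"])  # Grace Grace
```

Even in this twenty-line toy, correctness hinges on that guard flag; now imagine debugging the same dance across an entire DOM's worth of bindings.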
3. Enterprise Java: Spring Boot
What's so great about Java? A fast, mature, comprehensive class library; a gigantic ecosystem; write once, run anywhere; an active community. One thing conspicuously missing from that list: painless bootstrapping. Even hard-core Java developers resort to Ruby or Python to write quick one-off programs (admit it). And yet Java continues to dominate the enterprise for all of those other reasons.
Enter Spring Boot, the boilerplate evaporator—a framework that lets you fit a working Spring application in a single tweet:
— Rob Winch (@rob_winch) August 6, 2013
No unpleasant XML config, no sloppy generated code. How is this possible? Simple: Spring Boot has some pretty strong opinions. That tweet-sized application suddenly makes sense when you realize that the framework automatically spins up an embedded servlet container to handle incoming requests on port 8080: a decision you never explicitly configured, but a conventional (and firewall-friendly) call.
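That "convention over configuration" posture is easy to sketch in any language. The snippet below is a toy illustration, not Spring Boot's actual mechanism: every setting carries an opinionated default, and explicit configuration overrides only the opinions you disagree with.

```python
# Toy sketch of convention over configuration (NOT Spring Boot's real API).
# The framework ships opinionated defaults; user config overrides them.

DEFAULTS = {
    "server.port": 8080,             # mirrors Spring Boot's conventional port
    "server.container": "embedded",  # no external servlet container to install
}

def effective_config(user_config=None):
    """Merge the framework's opinions with whatever the user specified."""
    config = dict(DEFAULTS)
    config.update(user_config or {})
    return config

print(effective_config())                       # pure convention
print(effective_config({"server.port": 9090}))  # one opinion overridden
```

The payoff is exactly what the tweet demonstrates: zero configuration is a valid, working configuration.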
How hot is Spring Boot? It's by far the most forked and most downloaded Spring project (not counting the master framework itself). And in 2015, for the first time, Google logged more searches for "spring boot" than for "spring framework".
4. Data processing: Apache Spark
Once upon a time (in 2004), Google developed a programming model (MapReduce) that generalized many distributed batch processing job structures, then wrote a famous paper about it; then some Yahoo folks wrote a Java framework (Hadoop) that implemented MapReduce and a distributed file system to simplify data access for MapReduce tasks.
For nearly a decade, Hadoop dominated the "Big Data" framework ecosystem, despite the narrow problem space it (optimally) addressed: batch processing. That was partly because business and scientific users were accustomed to batch analysis of large datasets anyway. But not all large datasets are best processed in batches. In particular, streaming data (such as sensor inputs) and data analyzed iteratively (as machine learning algorithms love to do) fit the batch model poorly. So dozens of new Big Data frameworks were born, and new programming models, application architectures, and data stores gained traction (including, for Hadoop itself, a new cluster management system decoupled from MapReduce: YARN).
But of all these new frameworks, Apache Spark (developed at Berkeley's AMPLab) is the easy-choice 2015 standout. Surveys (DZone report; Databricks infographic; Typesafe report) show huge growth in Spark adoption. GitHub commits have been growing linearly since 2013, and Google Trends shows exponential (yes, literally) growth in searches over 2015.
So Spark is popular. But what does it do? The headline answer is very fast batch processing, but that speed rests on one killer feature, which also opens up vastly more programming models than Hadoop: Spark makes data available as Resilient Distributed Datasets (RDDs) that remain in memory on multiple nodes after processing, yet without replication; instead, each RDD stores the information needed to recreate it (compare CQRS, pragmaticism, Kolmogorov complexity). This (obviously) lets algorithms iterate without reloading data from a (s)lower rung of the distributed memory hierarchy. It also means batch processing need no longer suffer the indignity of the "long stroke" of Nathan Marz's Lambda Architecture. RDDs even allow Spark to approximate true (push) stream processing, by running small batch jobs fast enough to keep latency within "effective streaming" bounds for many applications.
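The lineage idea deserves a sketch. The toy below is not Spark's API; it just shows the principle: a dataset records the recipe that produced it, so a lost in-memory partition can be recomputed on demand instead of being replicated up front.

```python
# Toy sketch of RDD-style lineage (NOT Spark's actual API): transformations
# are recorded lazily, results are cached in memory, and a "lost" cache is
# rebuilt by replaying the lineage rather than restored from a replica.

class ToyRDD:
    def __init__(self, source_fn, parent=None, transform=None):
        self._source_fn = source_fn  # how to (re)load the base data
        self._parent = parent        # upstream dataset in the lineage
        self._transform = transform  # the step that derives this dataset
        self._cache = None           # "in memory on a node"

    def map(self, fn):
        # Record the transformation lazily; nothing is computed yet.
        return ToyRDD(None, parent=self, transform=fn)

    def compute(self):
        if self._cache is not None:
            return self._cache       # fast path: already in memory
        if self._parent is None:
            data = self._source_fn()
        else:
            data = [self._transform(x) for x in self._parent.compute()]
        self._cache = data
        return data

    def lose_partition(self):
        self._cache = None           # simulate node failure: cached data gone

base = ToyRDD(lambda: [1, 2, 3])
squares = base.map(lambda x: x * x)
print(squares.compute())   # [1, 4, 9]
squares.lose_partition()
print(squares.compute())   # recomputed from lineage, not from a replica: [1, 4, 9]
```

Keeping the recipe instead of a second copy is what makes the in-memory caching cheap enough to leave data resident between iterations.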
5. Delivery: Docker
Okay, so Docker isn't a "framework" in the sense of "code library, generously defined, that imposes a specific set of conventions to solve large and recurrent problem sets". But if frameworks are just things that let you write code at a more suitable level of abstraction, then Docker is a framework extraordinaire. (Let's call it an exoskeletal framework, just to mix metaphors confusingly.) And it would feel funny to name "top 2015 anythings for developers" without including Docker on the list.
Why is Docker great? First, why are containers (earlier incarnations: FreeBSD jails, Solaris Zones, OpenVZ, LXC) great? Simple: isolation without a full guest operating system; or, the safety and convenience of a VM with far less overhead. But isolation takes many forms (chroot comes to mind, or really any virtual memory system), and it's pretty easy to spin up a container with systemd-nspawn, no Docker required. Merely being able to isolate processes isn't enough. Why is Docker especially great?
Two reasons: Dockerfiles ("the new tarballs") add portability, and the Dockerfile format is now a de facto standard. The first takes the pain out of application delivery (where earlier containers just created lighter VMs); the second makes container sharing social (and not just on Docker Hub). I can try your application without first mucking around for hours not trying your application. (Remember how freeing