Why I use Java

There are probably better languages than Java, depending on work requirements. But I haven't seen anything yet to pull me away.
I believe I started using Java in 1997, not long after Java 1.1 saw the light of day. Since that time, by and large, I've really enjoyed programming in Java; although I confess these days, I'm as likely to be found writing Groovy scripts as "serious code" in Java.

Coming from a background in FORTRAN, PL/1, Pascal, and finally C, I found a lot of things to like about Java. Java was my first significant hands-on experience with object-oriented programming. By then, I had been programming for about 20 years, and it's probably safe to say I had some ideas about what mattered and what didn't.

Debugging as a key language feature

I really hated wasting time tracking down obscure bugs caused by my code carelessly iterating off the end of an array, especially back in the days of programming in FORTRAN on IBM mainframes. Another subtle problem cropped up from time to time: calling a subroutine with a four-byte integer argument when it expected two bytes. On a little-endian architecture, the called routine saw the low-order bytes and the bug was often benign; but on big-endian machines, it saw the top two bytes, which were usually, but not always, zero.

Debugging in that batch environment was pretty awkward, too—poring through core dumps or inserting print statements, which themselves could move bugs around or even make them disappear.

So my early experiences with Pascal, first on MTS, then using the same MTS compiler on IBM OS/VS1, made my life a lot easier. Pascal's strong and static typing were a big part of the win here, and every Pascal compiler I have used inserts run-time checks on array bounds and ranges, so bugs are detected at the point of occurrence. When we moved most of our work to a Unix system in the early 1980s, porting the Pascal code was a straightforward task.
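Java makes the same point-of-occurrence guarantee. As a minimal sketch (not from the original article), an out-of-bounds array access in Java fails immediately with an exception, rather than silently corrupting memory as in C or old FORTRAN:

```java
// BoundsDemo.java - illustrates run-time bounds checking at the point of
// occurrence, the property credited to Pascal above; Java checks too.
public class BoundsDemo {
    // Returns a message if the out-of-bounds read is caught at the point of
    // occurrence, or null if the access was legal.
    static String readPastEnd(int[] a, int index) {
        try {
            int ignored = a[index]; // bounds checked at run time
            return null;
        } catch (ArrayIndexOutOfBoundsException e) {
            return "caught at point of occurrence: index " + index;
        }
    }

    public static void main(String[] args) {
        int[] data = new int[4];
        System.out.println(readPastEnd(data, 4)); // one past the end
    }
}
```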

Finding the right amount of syntax

But for all the things I liked about Pascal, my code was wordy, and the syntax seemed to have a tendency to slightly obscure the code; for example, using:

if … then begin … end else begin … end

instead of:

if (…) { … } else { … }

in C and similar languages. Also, some things were quite hard to do in Pascal and much easier to do in C. But, as I began to use C more and more, I found myself running into the same kind of errors I used to commit in FORTRAN—running off the end of arrays, for example—that were not detected at the point of the original error, but only through their adverse effects later in the program's execution. Fortunately, I was no longer living in the batch environment and had great debugging tools at hand. Still, C gave me a little too much flexibility for my own good.

When I discovered awk, I found I had a nice counterpoint to C. At that time, a lot of my work involved transforming field data and creating reports. I found I could do a surprising amount of that with awk, coupled with other Unix command-line tools like sort, sed, cut, join, paste, comm, and so on. Essentially, these tools gave me something a lot like a relational database manager for text files that had a column-oriented structure, which was the way a lot of our field data came in. Or, if not exactly in that format, most of the time the data could be unloaded from a relational database or from some kind of binary format into that column-oriented structure.

String handling, regular expressions, and associative arrays supported by awk, as well as the basic nature of awk (it's really a data-transformation pipeline), fit my needs very well. When confronted with binary data files, complicated data structuring, and absolute performance needs, I would still revert to C; but as I used awk more and more, I found C's very basic string support more and more frustrating. As time went on, more and more often I would end up using C only when I had to—and probably overusing awk the rest of the time.

Java is the right level of abstraction

And then along came Java. It looked pretty good right out of the gate—a relatively terse syntax reminiscent of C, or at least, more so than Pascal or any of those other earlier experiences. It was strongly typed, so a lot of programming errors would get caught at compile time. It didn't seem to require too much object-oriented learning to get going, which was a good thing, as I was barely familiar with OOP design patterns at the time. But even in the earliest days, I liked the ideas behind its simplified inheritance model. (Java allows for single inheritance with interfaces provided to enrich the paradigm somewhat.)
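As a sketch of that simplified model (the class and interface names here are invented for illustration), a Java class extends exactly one superclass but may implement any number of interfaces:

```java
// InheritanceDemo.java - a hypothetical sketch of Java's inheritance model:
// single inheritance of classes, enriched by multiple interfaces.
interface Swimmer { default String swim() { return "paddling"; } }
interface Walker  { default String walk() { return "strolling"; } }

class Animal {
    String name() { return "animal"; }
}

// One superclass (Animal), any number of interfaces (Swimmer, Walker).
class Duck extends Animal implements Swimmer, Walker {
    @Override String name() { return "duck"; }
}

public class InheritanceDemo {
    static String describe() {
        Duck d = new Duck();
        return d.name() + ", " + d.swim() + ", " + d.walk();
    }

    public static void main(String[] args) {
        System.out.println(describe()); // duck, paddling, strolling
    }
}
```

(The default interface methods shown arrived later, in Java 8; the single-inheritance rule itself has been there from the start.)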

And it seemed to come with a rich library of functionality (the concept of "batteries included") that worked at the right level to directly meet my needs. Finally, I found myself rapidly coming to like the idea of both data and behavior being grouped together in objects. This seemed like a great way to explicitly control interactions among data—much better than enormous parameter lists or uncontrolled access to global variables.

Since then, Java has grown to be the Helvetic military knife in my programming toolbox. I will still write stuff occasionally in awk or use Linux command-line utilities like cut, sort, or sed when they're obviously and precisely the straightforward way to solve the problem at hand. I doubt if I've written 50 lines of C in the last 20 years, though; Java has completely replaced C for my needs.

In addition, Java has been improving over time. First of all, it's become much more performant. And it's added some really useful capabilities. Try-with-resources very nicely cleans up the verbose and somewhat messy error-handling code around file I/O, for example. Lambdas provide the ability to declare functions and pass them as parameters, instead of the old approach of creating classes or interfaces to "host" those functions. And streams encapsulate iterative behavior in functions, creating an efficient data-transformation pipeline materialized in the form of chained function calls.
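A small sketch (not from the original article) combining those three features: try-with-resources closes the reader automatically, a lambda is passed as the filter, and a stream chains the transformation steps:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.util.List;
import java.util.stream.Collectors;

// ModernJavaDemo.java - try-with-resources, a lambda, and a stream pipeline
// working together on some in-memory "file" text.
public class ModernJavaDemo {
    static List<String> nonBlankUpper(String text) throws IOException {
        try (BufferedReader br = new BufferedReader(new StringReader(text))) {
            return br.lines()                          // stream of lines
                     .filter(s -> !s.trim().isEmpty()) // lambda as a parameter
                     .map(String::toUpperCase)         // method reference
                     .collect(Collectors.toList());
        }                                              // reader closed here
    }

    public static void main(String[] args) throws IOException {
        System.out.println(nonBlankUpper("alpha\n\nbeta\n")); // [ALPHA, BETA]
    }
}
```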

Java is getting better and better

A number of language designers have looked at ways to radically improve the Java experience. For me, most of these aren't yet of great interest; again, that's more a reflection of my typical workflow and (much) less a function of the features those languages bring. But one of these evolutionary steps has become an indispensable part of my programming arsenal: Groovy. Groovy has become my go-to solution when I run into a small problem that needs a small solution. Moreover, it's highly compatible with Java. For me, Groovy fills the same niche that Python fills for a lot of other people—it's compact, DRY (don't repeat yourself), and expressive (lists and dictionaries have full language support). I also make use of Grails, which uses Groovy to provide a streamlined web framework for very performant and useful Java web applications.

But is Java still open source?

Recently, growing support for OpenJDK has further improved my comfort level with Java. A number of companies and organizations are supporting OpenJDK in various ways, including AdoptOpenJDK, Amazon, and Red Hat. In one of my bigger and longer-term projects, we use AdoptOpenJDK to generate customized runtimes on several desktop platforms.

Are there better languages than Java? I'm sure there are, depending on your work needs. But I'm still a very happy Java user, and I haven't seen anything yet that threatens to pull me away.

Chris Hermansen portrait Temuco Chile
Seldom without a computer of some sort since graduating from the University of British Columbia in 1978, I have been a full-time Linux user since 2005, a full-time Solaris and SunOS user from 1986 through 2005, and a UNIX System V user before that.

8 Comments

Hi Chris! Impressive resume and nice write-up. Thanks for sharing.

I recommend that you take a look at Kotlin. Kotlin Koans are a nice way to start, and kts (Kotlin Scripts) may replace your Groovy. I really like Kotlin's very strong type system, while being really similar to Java and Groovy (sans the flipped declarations, which take a few days to get used to).

Thanks for the comments, TWiStErRob.

I've looked at Kotlin, but so far it hasn't "spoken to me". Perhaps one day it will! Groovy feels like I always wanted Python to feel, and it's led me to Grails, which has given me a great appreciation for All That Is Wonderful in Java (like for instance Spring).

Anyway, it sounds like you're pumped on Kotlin, so I beseech you to write one or more articles about it on opensource.com!

In reply to TWiStErRob (not verified)

I decided to use php and html long ago because you can't see the source code in your browser. You only see the results. I know there are many ways to prevent this, but as a newbie web programmer it seemed like a no-brainer for keeping database data safe.

Thanks for the comment, MJH. As far as I know, no server-side web framework exposes source code in the HTML it delivers to the browser (I'm speaking of things like PHP, Grails, Rails, Django...). Some don't even require source code to be on the server, instead executing byte code or native code. For instance, with Grails (with which I'm most familiar), the dev makes a .war file on the development machine (which contains Java .class files plus HTML, CSS, images, etc.) and then puts that on the server, where the application server (for example, Tomcat) unpacks it and serves it out.

In reply to MJH (not verified)

In other words, your reason for liking Java is the same reason for me liking Ada. Strong typing and syntax reduce debugging times a lot. Also formal checking (with SPARK, a subset of Ada, nothing to do with Apache) helps you to reduce the bug count.

It's just that I prefer the verbose syntax "if .. then .. else .. end if" to the compact {}-based syntax.

Thanks for the comment, Riccardo. Strong static typing is a big win in Java as far as my experience is concerned. I agree that we differ on preferring terse versus verbose syntax, and to each her/his own, I say.

The other thing that makes a huge difference for me is all the good stuff that comes with Java. For example, back in "the good old Pascal days," when I had severely limited memory available, I would build accumulators based on code values by having an intermediate lookup array that converted a code value (say, in the range 1-500) to an "occurrence index" (say, in the range 1-25). That occurrence index would be used to index into an array of pointers to heap structures to accumulate. Of course, this type of structure was prone to carelessly using a code value instead of the index of a code value to look up in the array of pointers, which would bring on all sorts of hard-to-track bugs.

Awk's associative arrays, and later Java's full set of HashMaps, got around this kind of problem entirely. Of course, by then my memory restrictions weren't nearly as problematic, either. And the combo of things like HashMap with the type checking provided by generics was a further boon to detecting errors at compile time.
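A sketch of that combination (the code values here are invented): the map is keyed directly by code value, so there is no separate "occurrence index" to confuse with the code, and generics make the key and value types compile-time checked.

```java
import java.util.HashMap;
import java.util.Map;

// CodeAccumulator.java - accumulating occurrences per code value with a
// generically typed HashMap, replacing the old lookup-array-plus-pointers
// structure described above.
public class CodeAccumulator {
    static Map<Integer, Integer> accumulate(int[] codes) {
        Map<Integer, Integer> counts = new HashMap<>();
        for (int code : codes) {
            counts.merge(code, 1, Integer::sum); // count occurrences per code
        }
        // counts.put("407", 1);  // would not compile: key must be an Integer
        return counts;
    }

    public static void main(String[] args) {
        System.out.println(accumulate(new int[]{407, 12, 407}));
    }
}
```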

So, not really being familiar with what kind of "batteries" Ada brings, I don't know for sure that we're quite on the same page! But anyway, again thanks for writing, and I invite you to consider writing an article "Why I use Ada" for opensource.com!

In reply to rpr

I agree wholeheartedly - I started with Java in 1996 and haven't looked back, although I still have some significant legacy code bases in C++ that I need to maintain on rare occasions.

New languages come and go all the time (goodbye Ruby and hopefully sanity has finally reached the people who thought writing JavaScript on the server side was a good idea).

Some of these languages say, for example, "I can do this incredible text manipulation in new language XYZ, and I end up typing 27% fewer characters than if I implemented the same text manipulation in Java."

In response to claims like these I say, "well done, here's your slow golf clap: when my job as a developer becomes 97% typing code and manipulating text in weird ways, I'll switch to your new flavor-of-the-month XYZ language".

I think hackers tend to see coding as largely an exercise in typing, so saving keystrokes seems like a productivity win. To them, things like type safety, which require typing in type specifications for every variable or argument, are just annoying extra keystrokes with no real benefit (LoL!)

In contrast, I tend to spend a lot of time evaluating different options for the OO models I am building BEFORE I start typing anything - time spent sketching multiple implementation alternatives on the whiteboard saves days, weeks, even months compared to times when I've just "hacked together the first thing that comes into my head".

Thanks for your comment, Golman. Generally I agree with you - there's a lot (perhaps too much) appreciation for shiny new objects in the developer's world.

Having said that, I believe there is value in brevity and conciseness, especially as expressed in Don't Repeat Yourself (DRY). To me, functional interfaces in Java 8 and onward are a huge benefit to code maintainers as well as developers, simply by reducing the boilerplate code arising from "here" objects whose only purpose is to provide an implementation for a single function. The obvious fact that this reduces typing is kind of by the way; the real benefit is the code is more compact and readable, and the "what" is much more clearly visible when not surrounded by large quantities of "how".
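As a sketch of that point (the sort criterion here is invented), the same operation written both ways: the anonymous-class version buries the "what" (compare by length) in ceremony, while the lambda-based version states it directly.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

// BoilerplateDemo.java - the "here" object versus the functional interface:
// two ways to sort strings by length.
public class BoilerplateDemo {
    static List<String> sortOldStyle(List<String> in) {
        in.sort(new Comparator<String>() {        // the class exists only to
            @Override                             // host one function
            public int compare(String a, String b) {
                return Integer.compare(a.length(), b.length());
            }
        });
        return in;
    }

    static List<String> sortNewStyle(List<String> in) {
        in.sort(Comparator.comparingInt(String::length)); // just the "what"
        return in;
    }

    public static void main(String[] args) {
        List<String> words = new ArrayList<>(Arrays.asList("ccc", "a", "bb"));
        System.out.println(sortNewStyle(words)); // [a, bb, ccc]
    }
}
```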

The whole enhanced-collections effort is another good example of this, from my perspective, with internal iterators and lambdas being seriously good stuff.

My Java (and Groovy) functional fu is still pretty rudimentary. For example, if I have an input file where each line contains several different key-value pairs to be loaded into several different maps, I know of no way to process that only once and deliver all maps as a functional result. Maybe it's there, and I just haven't yet seen it; but in Groovy, where I tend to do most of this kind of stuff, I find myself declaring all the maps and then using an "each" to iterate through the file, assigning map1[key1] = value1, map2[key2] = value2, etc as I go.
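For illustration only, here is a plain-Java sketch of the imperative one-pass approach described above; the line format ("key1=value1 key2=value2", first pair to the first map, second pair to the second) is invented for the example.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// MultiMapLoad.java - one pass over the input lines, filling two maps as we
// go, the Java equivalent of the Groovy "each" loop described above.
public class MultiMapLoad {
    // Returns [map1, map2] after a single pass over the lines.
    static List<Map<String, String>> load(String[] lines) {
        Map<String, String> map1 = new HashMap<>();
        Map<String, String> map2 = new HashMap<>();
        for (String line : lines) {
            String[] pairs = line.split(" ");
            String[] p1 = pairs[0].split("=");
            String[] p2 = pairs[1].split("=");
            map1.put(p1[0], p1[1]);   // map1[key1] = value1
            map2.put(p2[0], p2[1]);   // map2[key2] = value2
        }
        return List.of(map1, map2);
    }

    public static void main(String[] args) {
        System.out.println(load(new String[]{"a=1 b=2", "c=3 d=4"}));
    }
}
```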

100% in agreement on the spending time on design concept. I find these days that an alternative to whiteboarding is to prototype with Groovy. I'm not a terribly visual person, so the lack of diagrams doesn't perturb me. I might make several prototypes before I settle on one that seems to be headed in a good direction; and I often use the prototyping as an excuse to test relative performance, too. Same story when SQL is involved. But I suspect this all may be more of an indicator as to what works with the kinds of problems I find myself solving, rather than a general rule.

Anyway, thanks again for your thoughtful comment.

In reply to Golman (not verified)

Creative Commons LicenseThis work is licensed under a Creative Commons Attribution-Share Alike 4.0 International License.