10 Language and Technology Trends for 2016
New technologies are constantly being developed, and existing ones keep changing. Which programming languages and technologies will become even more important in 2016? In this article, I gaze into the crystal ball to identify 10 current language and technology trends that will be worth your time pursuing next year. My prognostications are based on information gleaned from Google Trends, GitHub, TIOBE Software, and other websites (such as Indeed.com's job search). The first section of this article focuses on language trends, and the second focuses on technology trends.
Language Trends
To further your career, what programming languages should you target for 2016? This section identifies several languages to consider, and I explain why they're important. My choices are based largely on information gleaned from the most recent TIOBE Index at the time of writing, as well as GitHub's language trends and data from Google Trends. Finally, I considered language applicability to the trending technologies covered in this article.
C and C++
C is a general-purpose, structured programming language that's biased toward system programming. C++ is a general-purpose, object-oriented programming language that's an outgrowth of C and is also biased toward system programming.
You might not think of the C and C++ languages as trendy, but the data shows otherwise. According to the TIOBE Index (see Figure 1), C is in second place and C++ is in third place, based on search results across multiple search engines.
Figure 1 The C language has dropped to second place (after Java), while C++ has risen to take third place in the TIOBE Index as of early December 2015.
The situation differs when examined from GitHub's perspective (see Figure 2), which determines language popularity based on hosted projects.
Figure 2 GitHub shows the C language dropping, while C++ has risen. The most recent data is from mid-August 2015.
Finally, let's consider Google Trends. Figure 3 shows the trend graph for the C programming language topic, and Figure 4 shows the trend graph for the C++ programming language topic.
Figure 3 Interest in the C language declined somewhat until around 2007, when interest seemed to stabilize.
Figure 4 Interest in the C++ language continues to be strong.
The enduring interest in C and C++ probably has a lot to do with their usefulness in embedded programming. At one time, many developers believed that C was the better choice for embedded development, but that opinion has since evolved, and C++ is now also widely used in the embedded arena. Also, the fact that many Internet of Things devices lack the processing power to run higher-level languages has given C and C++ an edge in this area.
If you're thinking about a career in programming embedded devices, consider learning C or C++. According to the Indeed.com job search site, at least 38,000 jobs are waiting for C++ developers, and around 130,000 jobs are waiting for C developers.
Java
Java is a general-purpose programming language that's concurrent, class-based, object-oriented, and designed to have as few implementation dependencies as possible. Java applications are compiled to bytecode, which executes on any Java-supported platform, leading to a high degree of portability. Java is also a software platform largely consisting of a virtual machine for executing bytecode.
The TIOBE Index ranks Java as the most popular programming language (refer to Figure 1). GitHub ranks Java as the second most popular programming language (refer to Figure 2). If you enter Java programming language into Google Trends, you'll see that interest in Java has been steadily increasing since 2007. Java's pace of evolution is largely responsible for the enduring interest. For example, enter Java 8 into Google Trends, and you'll see a sharp uptick in Java's popularity, which is most likely the result of introducing lambda expressions and the Streams API. Conversely, interest in the Java software platform and Java virtual machine has fallen.
Java is widely used in enterprise computing. It's also widely used in Big Data (discussed later) contexts via projects such as Apache Hadoop. Java is also common in embedded devices and the Internet of Things, where its portability and security features are advantages. Finally, Java is widely used to write source code for Android apps. However, Android's Java implementation is based on Apache Harmony rather than on Oracle's version of Java.
Many jobs currently await Java developers. For example, a recent search for Java jobs on Indeed.com revealed nearly 90,000 jobs in the United States. Java will undoubtedly gain more interest in 2016, when Java 9 arrives with its Java Module System. This capability will make it easier to bring Java to many more platforms, resulting in many additional job opportunities.
JavaScript and ECMAScript
JavaScript is a high-level, dynamically typed, interpreted programming language, standardized in the ECMAScript language specification. Although the TIOBE Index ranks JavaScript as the seventh most popular language, GitHub ranks JavaScript as number 1 (refer to Figure 2). Google Trends for JavaScript programming language also shows a steadily increasing interest in JavaScript.
What accounts for JavaScript's popularity? Basically, JavaScript is the language of the web browser, which serves as a universal client on platforms ranging from mobile devices to desktops. As well as being the top choice for creating the client side of HTML5-based web apps, JavaScript is also popular for server-side development via Node.js, which is an open-source, cross-platform runtime environment for developing server-side web applications.
JavaScript is also starting to benefit from the significant changes introduced by ECMAScript 6 and 7. These changes, which include classes, modules, a for...of loop, and static methods, help to simplify the development of complex applications. Check out Mozilla's "ECMAScript 6 support in Mozilla" and "ECMAScript 7 support in Mozilla" pages to find out how much of these specifications has been implemented in the Firefox web browser.
According to Indeed.com, at least 63,000 jobs are available to JavaScript developers. This might be the language for you in 2016.
Python and R
Python is a general-purpose, high-level programming language that emphasizes code readability and expressing concepts in fewer lines of code than is possible in languages such as C++ or Java. R is a programming language and software environment for statistical computing and data visualization, tasks that Python also supports. If you need to choose between these languages, check out the DataCamp.com post "Choosing R or Python for data analysis? An infographic."
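To illustrate the readability claim, here's a small sketch (using made-up sample text) that counts word frequencies and reports the most common words. A task like this takes only a few lines in Python, thanks to its built-in high-level types and standard library, whereas an equivalent C++ or Java program would typically need noticeably more code.

```python
# Count word frequencies with the standard library's Counter.
from collections import Counter

text = "the quick brown fox jumps over the lazy dog the fox"
counts = Counter(text.split())

# Report the three most frequent words (ties keep first-seen order).
top_three = counts.most_common(3)
print(top_three)  # → [('the', 3), ('fox', 2), ('quick', 1)]
```

The entire pipeline, from raw text to ranked results, fits in three statements, which is the kind of concision the language is known for.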
According to the TIOBE Index, Python is more popular than R. GitHub reinforces this position by listing Python and not R in its top 10 languages. Python's general-purpose nature probably accounts for its greater popularity. However, Google Trends indicates about the same level of interest in both languages, which may be due to their usefulness with Big Data (discussed later). If you're planning to become involved with Big Data, consider learning Python and/or R.
What does the job situation look like for Python and R? A recent Indeed.com inquiry revealed at least 43,000 Python jobs and 57,000 R jobs. Learning either language is time well spent.
Technology Trends
To further your career, what technologies should you target for 2016? I've identified six worthy candidates in this section. Each technology is already changing society, and its influence will become more pronounced next year.
3D Printing
3D printing creates three-dimensional objects via various processes. Also known as additive manufacturing, 3D printing relies on computer control to form an object by printing successive layers of a material. Materials currently in use include thermoplastics, advanced nickel alloys, carbon fiber, glass, conductive ink, rubber, modeling clay, and biological matter.
The 3D printing topic on Google Trends indicates significant interest in this technology. If you plan to get into 3D printing from a career perspective, check out the i.Materialise.com post "Getting Started with 3D Printing: Skills & Resources You Need."
What kinds of 3D printing jobs can you anticipate? The Business News Daily article "10 3D Printing Jobs on the Rise" (September 2013) identifies 3D design, 3D computer-aided design (CAD) modeling, research and development, biological and scientific modeling, architecture/construction modeling, and other job categories. The more recent Fabbaloo post "CNBC Reports on 3D Print Job Growth" (November 2015) points out that Lockheed Martin wants to hire at least 120 new workers skilled in 3D design and printing.
Big Data and Data Visualization
According to Wikipedia, Big Data is a broad term for data sets so large or complex that traditional data-processing applications are inadequate. Challenges include analysis, capture, data curation, search, sharing, storage, transfer, visualization, and information privacy. Data visualization involves the creation and study of the visual representation of data in order to extract meaningful information. Big Data's sheer volume and complexity make it particularly challenging to visualize.
A Google Trends search on Big Data shows that interest in this technology has been rising since around 2011. Similarly, a search on data visualization shows interest increasing since around 2007. One reason for growth could be the surge in activity involving the Internet of Things (discussed later), which is a top generator of Big Data from all kinds of devices that must be analyzed and visualized.
Languages and technologies widely used with Big Data and data visualization include Python, Programming with Big Data in R (pbdR), Apache Hadoop, and NoSQL. The previously mentioned DataCamp.com infographic shows how Python and R compare from a data-analysis perspective.
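As a taste of the summarize-then-visualize workflow that makes Python popular for this kind of work, here's a minimal sketch using only the standard library and hypothetical sensor readings. Real projects would reach for libraries such as pandas and matplotlib, or PySpark atop Hadoop, but the shape of the task is the same: compute summary statistics, then render the data visually.

```python
# Summarize a data set, then draw a crude text-based "bar chart".
import statistics

# Hypothetical daily request counts from a fleet of IoT sensors.
samples = [120, 135, 128, 160, 151, 149, 170, 165]

mean = statistics.mean(samples)
stdev = statistics.stdev(samples)
print(f"mean={mean:.1f}, stdev={stdev:.1f}")

# One row of '#' marks per sample, scaled down by a factor of 10.
for value in samples:
    print("#" * (value // 10))
```

A plotting library would replace the final loop with a single call, but even this toy version shows why analysts value a language in which the distance from raw numbers to a picture is short.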
In May 2014 InfoWorld.com published "Hadoop, Python, and NoSQL lead the pack for big data jobs." The information in the article was obtained from statistics gathered by the tech job site Dice.com. A recent visit to this site shows that R has made significant gains in Big Data and data visualization job listings.
Cloud Computing
Cloud computing is a kind of Internet-based computing in which shared resources and information are provided to computers and other devices on demand. It provides users and enterprises with various capabilities to store and process their data in third-party data centers.
At the time of writing, a Google Trends search for cloud computing showed that interest in this technology started to surge after 2007 and peaked around 2012 before falling somewhat to a more modest and sustainable level, which isn't surprising given that the early hype has largely died down as the technology has matured.
In late 2014, Forbes.com published the article "Where Cloud Computing Jobs Will Be in 2015," which noted nearly 400,000 IT cloud computing jobs in the United States alone. More recently, the influential research and advisory firm Gartner Inc. released a report stating that cloud computing will form the bulk of new IT spending by 2016, which should translate into even more jobs.
Internet of Things
Wikipedia describes the Internet of Things (IoT) as the network of physical objects or "things" that are embedded with electronics, software, sensors, and network connectivity, enabling these objects to collect and exchange data. The IoT lets objects be sensed and controlled remotely across existing network infrastructure, creating opportunities for more direct integration between the physical world and computer-based systems, and resulting in improved efficiency, accuracy, and economic benefit. Each thing is uniquely identifiable through its embedded computing system and is able to interoperate within the existing Internet infrastructure. Experts estimate that the IoT will consist of tens of billions of objects by 2020.
A Google Trends search for Internet of Things reveals interest in this technology starting after 2005 and surging around 2010, probably because Chinese Premier Wen Jiabao called the IoT a key industry for China, which planned major investments in it. In 2011, IPv6, whose enormous address space can accommodate IoT devices, received its first large-scale public test on World IPv6 Day. That year also witnessed the maturation of Arduino and other hardware platforms that make the IoT accessible to do-it-yourselfers.
Interest in the IoT continues to grow. Gartner.com forecast in November 2015 that 6.4 billion connected "things" will be in use in 2016, up 30% from 2015. How does this translate into jobs? According to Indeed.com, nearly 10,000 IoT jobs are available in the United States alone. The number of jobs should grow significantly as the IoT surges ahead.
If you're interested in pursuing an IoT career, you'll need to acquire some important skills. In late 2014, Forbes.com published the article "Ready for the Internet of Things? 5 Skills You'll Need," listing the necessary skills as being an associative thinker, a collaborator, a communicator, knowledgeable, and persistent. You should also become familiar with related technologies, such as Big Data, data security, and data analytics.
Mobile Computing
Wikipedia describes mobile computing as human-computer interaction in which a computer is expected to be transported during normal usage. Mobile devices range from smartphones and tablets to wearables such as the Apple Watch. The two dominant mobile-device operating systems are Android and iOS.
Mobile computing is expected to surge even higher next year. For example, one school of thought is that tablets will overtake notebook computers by 2016. Others believe that wearables capable of running third-party apps will overtake those that can't in 2016.
According to Indeed.com, nearly 37,000 mobile device jobs are currently available in the United States. You can expect greater job growth as mobile devices become even more ubiquitous. The Gartner.com report "Top Strategic Predictions for 2016 and Beyond: The Future Is a Digital Thing" (registration required) forecasts that by 2018 two million employees will be required to wear health and fitness devices as a condition of employment.
Virtual and Augmented Reality
Popularized by devices such as the Oculus Rift and Google Cardboard, virtual reality replicates an environment that simulates physical presence in a real or imagined world and lets the user interact in that world. By contrast, Wikipedia describes augmented reality as a live direct or indirect view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics, or GPS data. Augmented reality has been popularized by devices such as Google Glass.
According to Google Trends, interest in virtual reality began mounting around 2014. Interest in augmented reality took off around 2009, but has retreated somewhat. A recent job search on Indeed.com backs up this trend by showing around twice as many jobs in virtual reality as in augmented reality. Although the number of virtual/augmented reality jobs is quite low at the moment, articles such as Road To VR's "200 Companies Now Hiring—A Look at the Growing Virtual Reality Jobs Market" and The Market Mogul's "The next big trend: Augmented Reality" indicate strong growth potential and an increasing number of jobs for these technologies over the next several years.
Final Thoughts
If your career is stagnating, or you just want to obtain a job involving current high-impact programming languages and other technologies in 2016, become an expert in at least one of the languages and technologies I've discussed here. Each is trending and supported by many job opportunities. Furthermore, the synergy from using these and other languages and technologies to transform our world into something new is exciting. Be part of it!