Why I’m deleting my Facebook account

When I was a student in the early nineties, like many people I was impressed by the concept and the reality of the Internet. A number of things that had been very hard before the Internet suddenly became possible.
I was active in the student movement, mainly working on lowering the financial barriers to education. I saw (and still see) education as a very important tool for people to achieve something in life. For many people, education has been the tool that allowed them to escape a cycle of poverty.

The Internet opened a myriad of ways to improve this even more. It offered the possibility to lower the barriers to information, news, discussions, knowledge and wisdom. If we could make sure everyone, no matter what background, has access to the Internet, our collective knowledge would increase, which would benefit many things.

In those early days, I participated in the creation of some “platforms” that allowed many people to collaborate. Access to those platforms was mostly free, because we did everything in our spare time, driven by passion, and the infrastructure was often provided by universities.

I’ll never forget a discussion I had with a fellow student, in 1994. He told me that soon, access to information on the Internet would no longer be free, and that business models would emerge to take advantage of the opportunities of the Internet.
At that moment, I didn’t believe him. The Internet as I saw it was created, maintained, and expanded by people with vision, ambition, dreams, creativity, and with the guts to make things happen.

The naive me didn’t see that entrepreneurs and marketers were already creating platforms to monetize the Internet. I don’t think there is anything wrong with that. I’m happy for those early visionaries who created a good business by leveraging the opportunities of the Internet. I never really succeeded in transforming good ideas and code into a great business model, but it’s good that others managed to do so, and the economic gains enabled by the Internet are probably substantial.

However, the most famous business model currently on the Internet — or more specifically, on the World Wide Web — worries me. Many websites and platforms seem to be free, but they are not. Users get the idea that they are getting free access to news, information, a network of friends,… but they pay a lot. They give up their privacy and independence. The number of trackers on websites is scary, and our data is traded between huge IT companies and other commercial companies. We are the product.

If I compare the platforms we created in the nineties with what Facebook offers today, I don’t see a big difference in functionality. But there is a huge difference in the goal, and in the way it is presented to the user. On Facebook, and by extension other sites/platforms that generate revenue by selling ads, the goal is to keep visitors active on the platform as long as possible, so that more ads can be shown. It’s a very simple rule, and it works well. In order to keep visitors, the platforms have to show content that a particular visitor wants to see — otherwise they leave. It’s not in Facebook’s interest to show neutral, balanced content. A strongly pro or con post is more likely to get people’s attention — and their eyeballs. And thanks to the huge surveillance network of trackers and data processing, the platform knows exactly who you are and which posts will make you stay longer on the site.
Unsurprisingly, popular posts are often from people bragging about what they are buying and doing, begging for likes.

This is not the Internet I hoped for, in the early nineties. We can and must do better. Therefore, I #deletefacebook.

A Java approach for leveraging quantum entanglement in communication

In a previous post, I talked about leveraging quantum entanglement to create a secure communication channel. The post concluded with the following picture, which explains that if Alice and Bob each get one half of an entangled qubit pair (hence one qubit for each), they can use the value of their qubit to generate a secure key.

Alice and Bob sharing a pair of entangled qubits, provided by a “quantum entangler”

Before either Alice or Bob measures the value of the qubit they received, they have no idea if they will measure 0 or 1. In fact, because of the laws of quantum physics, nobody knows for sure. The entangled pair is in such a state that there is 50% chance that both Alice and Bob will measure 0, and 50% chance that they both will measure 1.

I wrote a simple Java application that demonstrates the different components of an end-to-end scenario in which Alice sends a message to Bob, using a key obtained by measuring the qubits she receives from the Quantum Entangler service. She knows Bob has the same key, because the measurement he makes on his half of each entangled pair corresponds to Alice’s measurement. In a simple scenario, Alice and Bob use one entangled qubit for each bit they transport. If Alice measures her qubit and reads the value 0, she sends the bit she wants to send unaltered. However, if the measurement of her qubit yields the value 1, she flips the bit she wants to send. Bob applies the same rules, so Alice and Bob have a shared secret.
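The flip-on-1 rule is effectively an XOR with the shared measurement outcomes. Here is a minimal classical sketch of that idea in plain Java (this is not the qsocket code itself; the shared random bits simply stand in for the correlated measurement results both parties would obtain from their halves of the entangled pairs):

```java
import java.util.Arrays;
import java.util.Random;

public class BitFlipDemo {

    // Alice and Bob apply the same rule: XOR each bit with the shared measurement bit
    static byte[] xorWithKey(byte[] bits, byte[] key) {
        byte[] out = new byte[bits.length];
        for (int i = 0; i < bits.length; i++) {
            out[i] = (byte) (bits[i] ^ key[i]);
        }
        return out;
    }

    public static void main(String[] args) {
        byte[] message = {0, 1, 0, 0, 1, 1, 1, 0}; // the bits Alice wants to send
        // the shared measurement outcomes: both halves of each entangled
        // pair yield the same (random) bit for Alice and Bob
        byte[] shared = new byte[message.length];
        Random entangler = new Random();
        for (int i = 0; i < shared.length; i++) {
            shared[i] = (byte) (entangler.nextBoolean() ? 1 : 0);
        }
        byte[] onWire = xorWithKey(message, shared);  // what Alice transmits
        byte[] decoded = xorWithKey(onWire, shared);  // what Bob recovers
        System.out.println("Bob got bytes: " + Arrays.toString(decoded));
    }
}
```

Because XOR is its own inverse, applying the same rule twice always recovers the original bits, whatever the shared measurement outcomes were.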

The application can be found at https://github.com/johanvos/qsocket. You can clone the repository, and run it as follows:

mvn javafx:run

This will show a bunch of output, where the most important statement is near the end:

Bob got bytes: [0, 1, 1, 0, 0, 1, 0, 0]

Those are the bytes that Bob is reading through his QSocket. They match the bytes Alice was sending, which can be seen in the Main class of the application:

    byte[] b = new byte[]{0,1,0,0,1,1,1,0};
    alice.getOutputStream().write(b);

Hence, the application demonstrates that the bytes sent by Alice are correctly received by Bob. The following high-level diagram demonstrates how this works:

high-level architecture

Alice and Bob talk to each other via a QSocket. While a QSocket does not directly extend a java.net.Socket, it has similarities. Most importantly, a QSocket has an InputStream and an OutputStream that can be used to receive and send data. Alice and Bob exchange their data via these InputStream and OutputStream instances. They don’t have to worry about encrypting the data, as this is done by the QSocket implementation. In order to do so, the QSocket implementation will request an entangled pair from the Entangler — that is, Alice’s QSocket will request one half of the entangled pair and Bob’s QSocket will request the other half.

In the real world, the Entangler should be a physical device capable of bringing 2 qubits into a Bell state and sending each of them to one of the parties in a conversation (Alice and Bob). In our simulation, the Entangler is a standalone service, which can run on a separate server — or on the same machine using a different port. The Entangler is the only component where quantum code is being executed — using the Strange quantum simulator.
The QSocket implementations use classical communication to talk to the Entangler, and to exchange commands. Keep in mind though that in practice, this component should be replaced by hardware capable of generating and transmitting entangled qubits.

I will go a bit deeper into the code in follow-up posts, but the interested reader is of course encouraged to have a look at the code in the GitHub repository.

If you want to learn more about quantum computing for developers, I highly recommend my book “Quantum Computing in Action”.

The code for the Strange quantum simulator can be found at https://github.com/redfx-quantum/strange

Entangled qubits and secure communication

Over the past years, I wrote a book on Quantum Computing for Manning Publications: Quantum Computing in Action.

I also wrote a Quantum Computing simulator in Java, named Strange.

In this article, I discuss a very basic approach to using quantum computing to improve network security. I briefly explain the concepts, and in a follow-up article I plan to show real code that demonstrates them in practice. I won’t go into details though. For more information about Quantum Computing and how it can be used together with classical computing, I highly recommend my book “Quantum Computing in Action”.

Quantum Computing offers a number of ways to make digital communication more secure. One of the main challenges in digital communication, where a series of bits (each either 0 or 1) is sent from one computer to another, is that those bits might be intercepted, and read or altered, while traveling over a physical carrier. Yet the communication is necessary in order to transfer the information.

In the following picture, Alice and Bob each have their own device (e.g. a laptop or a phone) and Alice wants to send a message to Bob by sending bits over the Internet.

The information flow between Alice and Bob uses wireless links, fiber cables, routers,… and it should be assumed that whatever is sent over these channels might be read by third parties, e.g. Eve:

Since Alice and Bob don’t want Eve to read their conversation, they encrypt their messages before sending them over the Internet. One popular approach is for Alice and Bob to use a shared secret: a key that both of them know, but nobody else does. A great example of how a shared secret can be created is the Diffie-Hellman key exchange algorithm (see https://en.wikipedia.org/wiki/Diffie%E2%80%93Hellman_key_exchange).
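The Diffie-Hellman idea fits in a few lines of Java using BigInteger. The sketch below uses toy-sized parameters for readability (real deployments use much larger, carefully chosen groups): both parties publish g^secret mod p, and each combines the other’s public value with its own secret to arrive at the same key.

```java
import java.math.BigInteger;
import java.security.SecureRandom;

public class DiffieHellmanDemo {

    // public parameters, agreed upon in the open (toy-sized for readability)
    static final BigInteger P = BigInteger.valueOf(2).pow(127).subtract(BigInteger.ONE); // a Mersenne prime
    static final BigInteger G = BigInteger.valueOf(3);

    // what each party publishes over the open channel: g^secret mod p
    static BigInteger publicPart(BigInteger secret) {
        return G.modPow(secret, P);
    }

    // what each party computes from the other's public part: g^(a*b) mod p
    static BigInteger sharedKey(BigInteger otherPublic, BigInteger secret) {
        return otherPublic.modPow(secret, P);
    }

    public static void main(String[] args) {
        SecureRandom rnd = new SecureRandom();
        BigInteger a = new BigInteger(100, rnd); // Alice's private exponent
        BigInteger b = new BigInteger(100, rnd); // Bob's private exponent

        BigInteger A = publicPart(a); // sent over the open channel
        BigInteger B = publicPart(b); // sent over the open channel

        // Eve sees A and B, but without a or b she cannot feasibly derive the key
        System.out.println(sharedKey(B, a).equals(sharedKey(A, b))); // prints true
    }
}
```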

When Eve intercepts the encrypted message, she can’t make sense of it since she doesn’t have the key Alice and Bob are sharing.

One of the intriguing aspects of quantum physics is that nature gives us the concept of a shared secret as well, through superposition and entanglement (see chapters 4 and 5 of https://www.manning.com/books/quantum-computing-in-action). Translated to computing and networking, this means that it is possible to generate 2 entangled qubits and send them to 2 different devices (e.g. Alice’s and Bob’s). These qubits start in a Bell state, which means they are in a superposition and entangled: they hold the values 0 and 1 simultaneously, and only when one of the qubits is measured do they “choose” a value of either 0 or 1. You can find much more background and info in my book, but the key idea here is that when Alice measures her half of the entangled pair, it determines not only the value of her qubit, but also the value of Bob’s qubit. Alice and Bob do not need to send information over a classical line, since Bob’s qubit already “knows” the state of Alice’s qubit. If you think this is spooky and hardly possible, since it seems to allow a transfer of information faster than light, you’re not alone. Even Einstein struggled with this concept (see https://en.wikipedia.org/wiki/EPR_paradox).
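The correlation itself can be reproduced in a small classical toy simulation (plain Java, not the Strange API). The sketch below represents the Bell state (|00⟩+|11⟩)/√2 as a four-amplitude vector over the basis states |00⟩, |01⟩, |10⟩, |11⟩ and samples joint measurements from it: Alice’s and Bob’s bits always agree, and each outcome occurs with 50% probability.

```java
import java.util.Random;

public class BellStateDemo {

    // amplitudes over the basis |00>, |01>, |10>, |11>
    static final double S = 1.0 / Math.sqrt(2);
    static final double[] BELL = {S, 0, 0, S};

    // sample one joint measurement: the probability of each basis state
    // is the square of its amplitude (Born rule)
    static int measure(double[] amplitudes, Random rnd) {
        double r = rnd.nextDouble();
        double cumulative = 0;
        for (int basis = 0; basis < amplitudes.length; basis++) {
            cumulative += amplitudes[basis] * amplitudes[basis];
            if (r < cumulative) return basis;
        }
        return amplitudes.length - 1;
    }

    public static void main(String[] args) {
        Random rnd = new Random();
        for (int trial = 0; trial < 10; trial++) {
            int outcome = measure(BELL, rnd);
            int alice = (outcome >> 1) & 1; // first qubit
            int bob = outcome & 1;          // second qubit
            // alice == bob on every trial: |01> and |10> have amplitude 0
            System.out.println("Alice=" + alice + " Bob=" + bob);
        }
    }
}
```

Of course this classical simulation misses the point that makes the real thing secure: here the “entangler” knows the outcomes, whereas with physical entangled qubits nobody does until a measurement happens.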

Once Alice and Bob each hold a qubit with the same value, only known to them, they can use that value to generate a shared secret and apply many existing encryption technologies.

An important question is then: “How can Alice and Bob each obtain a part of an entangled pair?” It turns out that this is already possible today, over limited distances and with a limited number of nodes. Especially worth mentioning is the work of QuTech in Delft, where lots of research is done on quantum networking. See for example https://qutech.nl/2021/04/15/dutch-researchers-establish-the-first-entanglement-based-quantum-network/.

While the equipment to generate entangled qubits and transmit them over network infrastructure to different nodes is not yet widely available, it is entirely possible to simulate this using quantum simulators, e.g. Strange, the quantum computing simulator I wrote in Java.

In a follow-up post, I will show code that uses this simulator to transmit bits from Alice to Bob, encrypted using entangled qubits.


A shell for prototyping scientific java applications

As I wrote in a previous post, I worked on data processing and visualization for my PhD in applied physics.
It sparked my work on Java, although I didn’t use Java for that particular part.
Like many researchers, I felt that the overhead of creating an application with tests and a build environment, then compiling and running it, was too high for quick experiments. I needed to create a simple plot, change some parameters, plot again, and so on.

Clearly, Java has a number of benefits. It has long been the number one language on the Tiobe index, and with 12 million developers worldwide it is easy to find a Java developer near you. Java has security as one of its cornerstones. Add multi-threading, garbage collection and a vibrant ecosystem with lots of great libraries, and it is clear that Java is a very powerful platform.

There are some great scientific libraries available for Java. I’m a big supporter of SkyMind, and their deeplearning4j and nd4j API’s allow developers to create high-performance AI and numerical applications in Java. For visualization, there are a number of tools on top of the JavaFX API’s, like FXyz, that can render complex 3D scenes leveraging hardware acceleration.

It would be great, though, if that same Java language could also be used during research and prototyping. In that phase, things like security and scalability might be less important. But it would be a huge time and cost saver if the exact same code that you are using in research and development can also be used in production — where security and scalability do matter.

Since Java 9, the JDK contains JShell, which is a REPL (Read-Eval-Print Loop) for Java. This is a very important piece for doing research and prototyping. Rather than editing code, compiling, running, and debugging, the REPL approach is often easier if you need immediate feedback from a single expression without having to go through the other steps.
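The kind of loop a REPL makes cheap, defining a small function, evaluating it over a range, changing a parameter, and evaluating again, looks roughly like this (a hypothetical snippet, shown here as plain Java; in a JShell session you would type the method and the loop directly at the prompt, with no class, no main, and no build file):

```java
public class ReplStyleDemo {

    // in JShell this would be a top-level method, typed straight at the prompt
    static double f(double amplitude, double x) {
        return amplitude * Math.sin(x);
    }

    public static void main(String[] args) {
        // evaluate over a range, change the parameter, evaluate again,
        // as you would before handing the values to a plotting API
        for (double amplitude : new double[]{1.0, 2.0}) {
            for (double x = 0; x <= Math.PI; x += Math.PI / 2) {
                System.out.printf("a=%.0f  f(%.2f) = %.3f%n",
                        amplitude, x, f(amplitude, x));
            }
        }
    }
}
```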

I did some experiments with extending JShell with an existing JavaFX wrapper, the Nd4j libraries, and a wrapper for the JavaFX Chart API’s.

The result is a simple shell-based application that does not require build files (no ant/maven/gradle required) and that gives immediate access to the API’s. Code snippets can be saved and restored, and also embedded in end-to-end Java applications.

It is still very rough, but this screencast shows how to create an INDArray using Nd4j and do some basic matrix operations.

numjava-nd4j

A second screencast shows how to create a simple function and plot the values for a specific range, thereby changing some parameters.

numjava-plot

Since we have recently been doing some work with (distributed) deep learning using Java in the mobile enterprise at Gluon (e.g. see https://github.com/gluonhq/gluon-samples/tree/master/deeplearning-linearclassifier), I plan to extend these experiments, as I have found that quick prototyping in a REPL environment really saves me time.

If you think this is useful for you too, please contact me with any kind of feedback.

My history with end-to-end Java in the community

Some recent evolutions in Java made me think about the history of Java and how I looked at the development of the language. The move from Java EE to EE4J and beyond, and the opening of client-side Java are clear signs that Java is a community platform rather than a company-controlled language.

THE EARLY DAYS

When I started my PhD in Applied Physics at Delft University of Technology more than 20 years ago, one of the main challenges was to combine computations and visualizations of lots of data.
My work was done as a collaboration between the faculties of Aerospace Engineering and Applied Sciences, and those groups are used to dealing with huge amounts of data. Most of the work was done in Fortran or C/C++. I still love these languages, and I admire the amazing work people have done with them.

But around that time (1996), there was a new language that introduced me to Object Oriented programming, and the advantages of encapsulation, reusability, multi-threading, security,… and that was Java.

At that moment, there were 2 obstacles preventing me from using Java for my PhD thesis: it didn’t work on Linux (I never managed to fluently move from the Apple ][ to the Macintosh without losing productivity so I converted to Linux) and it was slow.

I joined the Blackdown team, a bunch of people with the goal of porting Java to Linux, and we fixed that first issue. I still remember the weird looks and comments we got in those days: “Linux? Seriously? Who would ever want to run Java on Linux?”. Well, we wanted to do that so we did it.

I also do remember that in the initial days, we occasionally got blobs of new code from Sun Microsystems. The first time that the Swing code was added, I was very hopeful as I saw funny animations that might help me with my PhD.
I compiled the sources on my Sparc Station (I think it was a sparc 10) at work. It took 23 hours and 45 minutes to complete. Pretty long, but that’s ok — the end developer doesn’t have to compile Swing from code.

However, the performance of Java in the late nineties, and the lack of a truly great UI framework, forced me to use Fortran and C, and I used the excellent Seismic Unix package (https://en.wikipedia.org/wiki/Seismic_Unix) for my visuals.

EARLY 2K

Gradually, the HotSpot compiler became better and better, and for a number of reasons the Java programming language made a lot of sense for enterprise backend development. I have been following Java Enterprise development since iPlanet, the Sun ONE app server, the release of the Spring Framework, and the GlassFish Application Server. Writing enterprise applications in Java became the standard, for very good reasons. Java is available on almost all enterprise operating systems, and the platform and tools drastically increase developer productivity.

While doing lots of work on the enterprise side, I also spent time on embedded and mobile development. I wrote software for a telematics system, and we expanded that to mobile devices in general. Well, “in general” is not completely accurate. At that time, the mobile landscape was pretty chaotic. We had the software that worked on the embedded telematics device also working on the Compaq iPAQ, the Sharp Zaurus (bought at JavaOne 2002) and a bunch of other devices.
There were two very big hurdles that we had to face:
1. there was no standard API for Java development in general and device management in particular
2. there was no way we could automatically provision our apps to end-user devices. Telco’s and device manufacturers kept control of the local apps.

The business model of the telematics company I worked for was mainly server-side focused: allow software companies to create telematics software (e.g. infotainment), and manage the delivery of this software in a secure, controlled and managed way to the on board telematics devices. That requires lots of business tools like billing, logging, administration,…

While the business model clearly was server-focused, everybody realized that in order to get data to the server, you need clients to generate this data, and send it to your server.

In general, I’ve been advocating for a Mobile First – Cloud First strategy for a long time. If you want incoming traffic to your cloud, you better start where the data originates from.

TODAY

The mobile and embedded world have changed. From a technical point, Java on the client has never been in a better shape.
We have Java 9 working on desktop, laptop, mobile, embedded.
We have JavaFX as a modern, cross-platform UI framework that works on desktop, laptop, mobile, embedded, leveraging hardware accelerated rendering.
Java developers are now capable of not only writing enterprise applications, but also controlling client apps that generate the data that feeds their enterprise applications.

The possibilities of Java on the client are immense. One of the key features of Java is that security has been built in from day 1. This is much needed in today’s client development. Cyber security is one of the things that worries me very much. Client developers have a huge responsibility in this area, and Java helps them secure the application and its context.
Java is also very powerful, and using JNI you can leverage native libraries that unlock features like deep learning. You can write apps that use deep learning algorithms with the same code on mobile and desktop clients. Doing some local processing on the client can eliminate the need to send raw, privacy-sensitive data to servers.

As Java became bigger, it became obvious that there is no single company that can do all the development in all the corners of Java. When Oracle acquired Sun, it quickly became clear that Oracle is not interested in all aspects of Java. Oracle is a cloud-focused company, hence their investments in Java focus on Java for the Cloud.

But there are many other companies that benefit from Java in other areas, or that want to realize a link between client development in Java towards Enterprise development in Java on their clouds.

The Java Client ecosystem is very much alive and active. Every month, more than 30,000 developers download Scene Builder from Gluon. Scene Builder is a tool that allows developers to easily create JavaFX user interfaces in an intuitive way.
There are a number of excellent frameworks and libraries, created by many enthusiastic developers. Every month, 2,500 developers download the Gluon Mobile plugin for their IDE, allowing them to create cross-platform apps for iOS and Android, written in 100% Java. And this number is increasing month by month.
While over the years, Oracle has invested a lot in Java on the Client, it is clear that the wider community is now taking over.
As part of this movement, I became the project lead for the OpenJDK Mobile project, where the goal is to make sure the OpenJDK classes and VM code works on mobile platforms (iOS/Android).

In order to encourage interested third parties to be involved in the development of JavaFX, we created a mirror at GitHub at https://github.com/javafxports/openjdk-jfx.
This mirror is automatically pulling changes from the official OpenJFX repository, so its master branch is up to date with the official repository.
Developers can now fork the very latest JavaFX code, do their own experiments,
and hopefully contribute something back. Companies can evaluate new features in a more flexible way.


In summary, Java is an amazing thing. It’s a platform and a language, it works on enterprise cloud systems, on desktop, mobile, and embedded. It doesn’t fix all of the problems in IT, but one of its biggest assets is the fact that it is available everywhere and allows you to create robust, secure apps, leveraging tons of third party tools (IDE’s) and libraries. It is an ecosystem that is much bigger than a single company.
And if you want to be part of the Java success, you can just do it.

Cloud systems are becoming more attractive to developers

The IT industry is known for using terminologies that are hyped. In some cases, it appears that renaming an old approach with a catchier name leads to an increased usage of that old approach. In reality however, more things have changed than just the name. “The cloud” has become increasingly popular as a term, although many developers have been using some sort of remote, scalable, hosted service for decades. But the cloud environments that are being built today, are really game-changers.

I am very fortunate to be in close contact with engineers working on the biggest cloud systems. At Gluon, we have a SAAS product called Gluon CloudLink, which allows enterprise developers to extend their enterprise services into mobile apps. We offer Gluon CloudLink as a SAAS, and the service itself is hosted on AWS EC2 instances, using Amazon services such as RDS and DynamoDB.

As a SAAS provider, the advantage of hosting your solution in a cloud environment is that developers can access it regardless of where their own infrastructure is hosted. While that sounds great, there is also a downside. Companies that require an IT infrastructure are often customers of one of the bigger cloud providers, e.g. Amazon, Pivotal, Oracle, Google, IBM, Microsoft. Those companies have a trust relationship and a single account with that provider. While they can shop for third-party services and easily do a technical integration, it requires them to set up a separate billing process.

The move that we are seeing today is a tendency towards cloud marketplaces. In such a marketplace the cloud provider allows third party SAAS providers to offer their services towards customers of the cloud provider. This is a triple win:

  • the end customer has a single point of contact (and billing), which is the cloud provider
  • the cloud provider has a more complete offering for its customers, which keeps the customer happy and loyal
  • the SAAS provider benefits from the customer ecosystem and infrastructure provided by the cloud provider

In the past, when you had an account with a cloud provider, you almost exclusively used the services developed and provided by the cloud provider. But in many cases developers want to use other services as well. Rather than developers having to deploy the other services on the cloud systems, it is often more beneficial if those services are offered by the cloud provider as well, and integrated into their infrastructure.

This concept makes it possible for cloud providers to offer services in different vertical markets that require specific expertise. Their offering can become much wider than it is today.

I see similarities between this evolution and opening Apple AppStore and Google Play Store to third party developers. In both cases the big providers (cloud providers or mobile providers) realised the advantages of working together with third party software developers, rather than doing everything themselves.

SAAS developers, on the other hand, benefit from more visibility and easier integration. Depending on how the cloud provider organises it, the partner network can be extremely valuable for the SAAS developers, as it can facilitate the sales and marketing process for them. In the end, the customer of a cloud provider expects a fairly complete offering from its provider, hence the latter benefits from matching the customer with the third-party service that best fits its requests.

All big cloud providers either have this in place, or are working on it. And it’s going at a high speed. Personally, I am pleasantly surprised to see how fast this evolution is taking place. The cloud vendors are typically big companies, and it is not always easy for them to innovate quickly. But the amount of flexibility and innovation that I have been seeing over the past year in some of the biggest cloud providers is really astonishing. I have the greatest respect for the engineers working on those systems, as they have to fit a number of old and recent requirements into a single product, e.g. security, microservices, container support, scalability, availability, service discovery, binding,…

While it is hard for a small company to create its own IAAS/PAAS infrastructure like the big cloud providers do, there are a myriad of other opportunities. Those cloud marketplaces can only work when there is a sufficient number of third-party services being offered via the marketplace. Hence, it is a great time for SAAS companies to offer their products via the cloud marketplaces of the different cloud vendors.

Happy New Year, Java

In 2016, Java turned 21. According to the Tiobe index, Java is still the most popular language. Whether you like Java or not, the fact that a language sets the pace for such a long time is pretty amazing.

My most popular tweet of 2016 was this one:

One of the reasons why Java is still great, is because the JDK team thinks very hard about long-term consequences of design changes.

I really think this is one of the not-so-hidden reasons for the success of Java. There is a thin line between not innovating and innovating too fast. The Java language, and by extension the Java platform, clearly stays on that line. Granted, the difference between the Java language and the Java platform allows for some margin. The Java platform allows other, new, experimental languages to be developed. Some of those languages become successful by themselves, others gain less critical mass. But in most cases, those new JVM languages taught the Java core developers something about the possibilities of the Java platform, and about the wishes and trends in the developer community.

At the JavaOne conference in 1999, Java was divided into 3 parts: Java SE, Java ME and Java EE. While the distinction between those parts is often artificial, it makes it easy to discuss the state of the different parts of Java.

Java SE

Java 8 was released in 2014, and it was picked up very fast by a large part of the developer community. We saw a number of great books, tutorials, and articles on the new features of Java 8. Functional programming is hot today, and the Java language now allows developers to take advantage of it.

Unless something really strange happens, 2017 will be the year of Java 9. It took a long time and a number of delays to get this release ready, but I think that is very reasonable. First of all, it is not easy to envision the modularity system that developers are willing to work with for the next decade or so. Secondly, even after this became clear, there were a huge number of small and large hurdles that needed to be overcome before the JVM works with the new module system, while providing most of the backward compatibility.  Actually, it is nothing less than amazing that most of the design decisions taken by James Gosling and his team when they created the Java language in the early nineties are still very much valid.

Mixing new paradigms with older, mature concepts is never an easy task. When Java 9 is released later this year, I don’t expect a big revolution from day one. Developers will gradually move from classpath-based deployment to a modular system. The JVM will be future-proof once again, and that is the important part.

Java EE

The past year was rather turbulent for the server-side Java continent. Ironically, the unexplained slowdown of Oracle’s work on the Java EE 8 specification in the end triggered a number of initiatives. While the year started as a boring, stagnating time, it ended with lots of activity, coding, and collaboration.

The Java EE 8 release is now scheduled for the end of this year, shortly followed by the Java EE 9 release. The Java EE 8 release will fix some loose ends, and it will contain a number of important new JSR’s that will make the daily life of the Java EE developer much easier (e.g. JSON-B). 

One of the terms used most often in 2016 is without doubt “MicroServices”. Unfortunately, it is used as a marketing term as well, and that often makes discussions harder. The MicroProfile initiative is a very interesting move by some vendors and luminaries in the Java server-side space, and they are really doing great things with Java on the server side. The Spring framework is also a valid option for creating MicroServices, and there are already many companies and organisations using it in production. It is interesting to see how the Java server-side market, which is often described as slow or inert, is embracing “new” concepts like MicroServices. Since server-side Java dominates the enterprise market, this is really great news. I like the fact that at least 3 big and (more or less) open initiatives are working on how to integrate the microservices concept into their stack. Many great enterprise developers are contributing to one or more of those initiatives. I sincerely hope that the communication lines between Java EE, MicroProfile and Spring will be used a lot in 2017. It is great to have multiple initiatives, as this gives developers and companies a choice. But interoperability is another key word in our industry, hence communication is important as well.

Java ME

The Java platform was originally developed with mobile and embedded devices in mind. This idea is often overshadowed by the success of Java on the server side. Also, in the past there were major issues with running Java on resource-constrained devices. Today, the situation is changing. Mobile and embedded devices are becoming more powerful, and thanks to standardisation, it is now easier to extend the Java Write Once, Run Anywhere idea to mobile and embedded.

This is an area I am currently involved in, and with Gluon, we are working very hard to provide and maintain first-class support for Java on mobile and embedded devices. While the technology is now mostly in place, the business model is the next challenge to tackle. Combining mobile/embedded functionality with cloud systems is very important, as it allows the value created by billions of small devices to be analysed and used in enterprise systems. That is the core of my daily work at Gluon today, and I really believe this area will take off in 2017.

In summary, even after more than 20 years, it is still very exciting to be a Java developer. There are simply so many opportunities where Java can make the difference. And on top of that, the Java ecosystem is very much alive, and it allows developers to participate and shape the future.

Bundling our JavaFX Activities into Gluon

I am happy to announce that all JavaFX work that LodgON is doing will now be done under the Gluon brand.
For those of you who have been following the LodgON JavaFX activities, this will not come as a surprise. LodgON developers are co-founders of Gluon, and by combining our expertise with the expertise from the other founders and employees, we believe we can build a great company dedicated to JavaFX products and services.

As part of this move, we transferred the JavaFXPorts documentation website (http://javafxports.org), which was previously hosted by LodgON, to the Gluon documentation site. The Gluon documentation is generated using AsciiDoc, and this is what we are using for JavaFXPorts from now on as well. You can read more about this at http://gluonhq.com/javafxports-has-a-new-home

To me, maintaining text files in a repository is much easier than working on HTML files or using a content management system. As of today, the burden of updating websites with version numbers and release notes will no longer delay new releases of JavaFXPorts. I can focus on code, and keep the AsciiDoc files up to date.

For our existing JavaFX customers, the integration within Gluon is a huge benefit. LodgON’s expertise is mainly in writing clean code and integrating it with enterprise functionality. While these competencies are now in Gluon as well, Gluon also has developers who are experts in user interface and user experience. As a consequence, we can now help existing LodgON customers with their questions on UI, UX and controls.

In the end, this should benefit a much larger audience than the existing LodgON customers. Having a one-stop shop for all Java client related products, services and development will lower the adoption barrier for Java on the client, including mobile clients.

Java in 20 words

Java turns 20 this week. Java is more than a programming language or a platform. It is a large part of the life of many developers. In this blog post, I try to summarize what Java means to me in 20 words.

  1. Blackdown

    I started my Java career on a SparcStation running Linux. Unfortunately, there was no Java SDK for Linux/SPARC at the time (1995). Fortunately, there was a very small group of people working on a port of Java to Linux, and I joined that team. This was the Blackdown team. Many people said we were crazy, as nobody was interested in running Java on Linux.
    I learned a couple of things: if you need something and it doesn’t exist, you can always do it yourself. And don’t listen to people saying “this will never work”.
    With Blackdown, we received an award for the best team contribution to Java at one of the early JavaOne editions, but in general, I feel the role of Blackdown was more important than many people realize. I miss my old buddies who did most of the work. It’s even hard to find references to the team that did it, but fortunately we have the Wayback Machine, so here is an old page of contributors who deserve a huge Thank You: http://web.archive.org/web/20070807032743/http://www.blackdown.org/java-linux/java-linux-contact.html

  2. Genius

    Java was probably the right language at the right moment. But that alone doesn’t explain its ongoing success. I am very privileged to have had conversations with a number of the people who created Java or who have defined the language and platform over the years, and those are truly geniuses.

  3. Compiler

    The Java compiler is my best friend when developing. The compiler is the one that gently reminds you that you are about to do something you probably don’t want to do. This makes Java different from non-compiled languages, where the gentle reminder is replaced by an unavoidable runtime error.
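    A tiny example of that gentle reminder, using generics (class and variable names are just for illustration):

    ```java
    import java.util.ArrayList;
    import java.util.List;

    public class CompilerCheck {
        public static void main(String[] args) {
            List<String> names = new ArrayList<>();
            names.add("Duke");
            // The next line would be rejected before the program ever runs:
            // names.add(42); // error: incompatible types: int cannot be converted to String
            String first = names.get(0); // no cast needed, no runtime surprise
            System.out.println(first);
        }
    }
    ```

    In a dynamically typed language, that wrong `add` call would only blow up at runtime, possibly in production.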

  4. Sun

    Credit to Sun for providing the environment the Java language needed in its early years. It is really a pity that Sun doesn’t exist anymore, but people working on Java code know that Sun is still there (com.sun…). I even download my SDKs from http://java.sun.com !

  5. Everywhere

    One of the early promises of Java, and still relevant. Actually, even more relevant than ever. Java is a language that is not tied to a particular container or environment. It is not a hammer that treats everything as a nail, though. It really adapts to its environment, from the very small to the very large.

  6. Fun

    The best moments on work days are those where I say: “Are we done with this meeting/administration/…? I’ll go back to coding now”. Every day, once a day, give yourself a present. Don’t plan it. Don’t wait for it. Just let it happen. And write some Java code. I feel more relaxed when writing code.

  7. Jobs

    My job as a Java developer is a perfect job. I get paid for what I like to do: writing Java code. If you look at job sites, you see there are many opportunities for Java developers. As Java developers, we should be very grateful that we can support our families by doing what we want: writing Java code.

  8. Oracle

    When Oracle acquired Sun, many people feared this was the end of Java. However, it was also clear to many that Oracle relies on Java. As Larry Ellison said during his JavaOne appearance after the acquisition was announced: “Sun didn’t make money on Java, but Oracle does.” So Oracle keeps investing in Java. Not only because they love us and the Java language, but also because it benefits their business.

  9. Ecosystem

    The Java ecosystem is an interesting one. Because Java is so big and works on so many different environments, one company cannot oversee and control everything. This allows for interesting partnerships, and it provides opportunities. Recently, I co-founded Gluon, which is a great proof of this ecosystem: Oracle isn’t interested in commercializing everything that is Java related. At Gluon, we focus on Java client technologies that are very mature, but that are not in a commercial Oracle offering.
    There are examples of other technologies that are controlled by a single company. I don’t think that is good for innovation.

  10. JavaOne

    This is really one of the things that make Java unique. The yearly pilgrimage to San Francisco is something I really encourage every Java developer to do at least once in their career. It would be nice to have it in May again though, not at the same time as Oracle OpenWorld. It would be more affordable for developers that way.
    JavaOne paved the way for a number of other successful Java conferences. If you look at the growth of the Devoxx family and other conferences, you realize that Java developers like to gather together to learn and discuss.

  11. Community

    It’s hard to put a number on it, but the Java community is co-responsible for the success of Java. As I said earlier, Java is too big to be controlled by a single company. I don’t remember when the term “Java Community” was first used, but I remember the “Community One” editions that accompanied JavaOne 2007 and later. It is hard to describe the Java Community. I think the power of the Java Community is the combination of individuals and companies that have different goals, different ambitions, different ethnic and religious backgrounds, but that all benefit from a strong Java platform.

  12. Respect

    This is probably related to the community aspect. I’ve seen many debates and discussions in the Java world. Strong opinions, healthy discussions, and also discussions that were completely irrelevant. But those discussions are almost always held with respect for each other. That may sound obvious to many, but if you have a look at forums where other Internet technologies are discussed, it is by no means a given. Occasionally, I am shocked when I read discussions between leading people in other technologies.
    There are heavy discussions in the Java world as well, but as far as I am aware, all are held with mutual respect.

  13. JavaDoc

    While working on Java code, there is one thing you should always have at hand: the JavaDoc. The uniform format of JavaDoc is in my opinion a big contributor to the success of Java. When I have to work with a new framework, the first thing I look at is the JavaDoc. Java technology never feels strange, as long as there is JavaDoc.
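    The uniform format looks the same whether you are reading the JDK docs or a third-party framework. A minimal example, with a hypothetical utility class:

    ```java
    /**
     * Small utility demonstrating the uniform JavaDoc format.
     */
    public class Greeter {

        /**
         * Returns a greeting for the given name.
         *
         * @param name the name to greet; must not be {@code null}
         * @return the greeting, e.g. {@code "Hello, Duke!"}
         * @throws NullPointerException if {@code name} is {@code null}
         */
        public static String greet(String name) {
            if (name == null) {
                throw new NullPointerException("name");
            }
            return "Hello, " + name + "!";
        }

        public static void main(String[] args) {
            System.out.println(greet("Duke"));
        }
    }
    ```

    Running `javadoc` over this file produces the same familiar HTML layout you know from the JDK itself, which is exactly why new frameworks never feel strange.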

  14. IDE

    I have to admit, I made a switch here. Until late 2007, I wasn’t using an IDE at all. I tried most of them in the early 2000s, but I felt they were too slow. So I kept using vi for a very long time. I thought that was pretty cool, but when I told James Gosling about this, he gave me a strange look and recommended that I have another look at NetBeans. That’s what I did, and that’s what I’m still using (though I still use vi occasionally).
    In general, IDEs make developers far more productive, and they make their jobs more relaxed. Fortunately, the popular Java IDEs are updated very quickly and are highly interoperable.

  15. Jini

    There are a number of Java technologies that I really loved but that didn’t reach the major breakthrough we were hoping for. Jini is my favorite technology that didn’t make it. I really loved the idea of connected devices, talking to each other and to big servers… Great technology, but they spelled it wrong, I guess. Calling it “IoT” and waiting a decade made it more viable. I remember it was rumored that Jini was an acronym for “Jini Is Not Initials”. So on the plane back from the JavaOne where Jini was announced, I wrote my own implementation of Jini, and called it Gina, which means “Gina Is Not Acronym”. To be fair, I can’t consider Gina a lost investment. On the contrary, I learned about the need for connecting embedded and big devices, and that was the foundation for my later work on RedFX and more recently on Gluon Cloud.

  16. Feel

    James Gosling often talked about “the Feel of Java” — e.g. see http://www.win.tue.nl/~evink/education/avp/pdf/feel-of-java.pdf. When working with Java, you get used to how it “feels”. The current architects of the Java language are evolving the language while preserving the Feel of Java.
    I think this is very important, and so far the subtle balance in preserving the Feel of Java while leveraging new capabilities is working very well.

  17. Performance

    I remember Java got bashed in the early days for being slow. I never worried about this, because I was confident it was just a matter of time before the runtime engines would provide the required performance. Today, Java allows you to create incredibly performant applications. It puts the power in the hands of the developers. If you use the tools correctly (e.g. Java EE pooling configurations, Java concurrency, …) you can create powerful applications.

  18. Reliable

    When I’m working on end-to-end projects, ranging from embedded devices with cables and LEDs to big enterprise systems in virtual clouds, I encounter issues in many different layers. When the issue is in one of the Java layers, I feel very relaxed, as there is nothing that prevents me from fixing it. Java is your reliable friend.

  19. JCP

    The Java Community Process as it is today is one of the most open standardization organizations that I am aware of. The JCP facilitates the process for defining Java specifications. They put lots of effort into involving the Java community in this process. Long ago, Java critics complained that the specs were created by a few people behind closed doors. Today, this is by no means the case. I don’t know of any other Internet technology that is so open to input from its own developers.

  20. Future

    Java is 20 years now. If you search for announcements of “the death of Java”, it is amazing how dumb some self-declared analysts are. I typically don’t spend time arguing with them. If you like Java, use it. If not, you have two options: either you try to improve it, or you use something else.
    Because Java showed over the past 20 years that innovation is possible while maintaining the feel of Java, I am confident that the Java Platform has a bright future.

LodgON and RoboVM delivering JavaFX on Mobile

Earlier this week, a partnership between RoboVM and LodgON was announced, in which LodgON assists RoboVM with porting JavaFX to iOS, leveraging the RoboVM compiler. You can read more about the partnership in the press release and in this article on voxxed.com.

LodgON has been very active in porting JavaFX to Android devices. It brings the “write once, run anywhere” promise one step closer to reality. With a growing number of mobile devices connected to the Internet, and with more and more of these devices getting applications from an “appstore”, it becomes increasingly important for Java Developers to have a uniform way to create applications that can be uploaded to these appstores.
At this moment, it is already possible to create JavaFX applications, and convert them into Android packages or iOS applications. The process for doing this, however, used to be very different for Android and for iOS. Also, although both the iOS port and the Android port were based on OpenJFX, they used different versions.

In order to make it easy for developers to write applications that target all mobile devices, we have to take away the differences between a JavaFX Android and a JavaFX iOS application. Therefore, it makes sense for RoboVM and LodgON to work together more closely. The RoboVM Compiler is a brilliant piece of software that translates Java bytecode into native iOS code before it is uploaded to the AppStore, and thus before it is executed on the device.

During the first weeks of our collaboration, we already managed to bring JavaFX to the same level on Android and iOS. The JavaFX runtime we make available for the mobile platforms is based on the JavaFX 8u40 runtime for Windows, MacOS X and Linux. JavaFX 8u40 will be available as part of Java SE 8u40, which is expected to be released in March.

Not only is the codebase now the same, we also created a new unified Gradle plugin that makes it very easy to create Android and iOS packages. We used Gradle because we want to leverage the “convention over configuration” paradigm. Simply including the “javafxmobile-plugin” plugin in your build.gradle file will create tasks that build the required mobile packages, or even install them on your devices. Check the Getting Started page on javafxports.org for more detailed information.
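Thanks to convention over configuration, the build file stays small. A minimal sketch of what a build.gradle could look like; the main class name is hypothetical, and the exact buildscript setup for the plugin is documented on the Getting Started page mentioned above:

```groovy
// build.gradle (sketch; see the Getting Started page on javafxports.org
// for the exact buildscript configuration of the plugin version you use)
apply plugin: 'javafxmobile-plugin'

// hypothetical JavaFX application entry point
mainClassName = 'com.example.MyJavaFXApp'
```

With this in place, the plugin adds the tasks that package the application for Android and iOS, so the same source tree can target both platforms.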

This is the result of only a few weeks of work, and there is definitely lots of work to do. We don’t support all of the features of JavaFX 8 yet, e.g. we are missing support for Dialog and for the media package. We need to work on performance, especially on the iOS port. The Gradle plugin needs many more configuration options, and we should make it possible to integrate with other build systems as well. With the recent RoboVM – LodgON partnership, I’m confident that we can tackle these challenges, and we will get closer to making JavaFX on mobile a viable option for developing client applications.

Having the JavaFX runtime available on mobile devices is one step, but I realize more is needed in order to create visually attractive and business-friendly mobile applications. Fortunately, things are moving forward in that area as well. Expect more exciting news in a few months.