Creating a REST API with Spring Boot and MongoDB

Spring Boot is an opinionated framework that simplifies the development of Spring applications. It frees us from complex configuration files and helps us create standalone Spring applications that don’t need an external servlet container.
This sounds almost too good to be true, but Spring Boot can really do all this.
This blog post demonstrates how easy it is to implement a REST API that provides CRUD operations for todo entries that are saved to a MongoDB database.
Let’s start by creating our Maven project.
Note: This blog post assumes that you have already installed the MongoDB database. If you haven’t done this, you can follow the instructions given in the blog post titled: Accessing Data with MongoDB.

Creating Our Maven Project

We can create our Maven project by following these steps:

  1. Use the spring-boot-starter-parent POM as the parent POM of our Maven project. This ensures that our project inherits sensible default settings from Spring Boot.
  2. Add the Spring Boot Maven Plugin to our project. This plugin allows us to package our application into an executable jar file, package it into a war archive, and run the application.
  3. Configure the dependencies of our project. We need to configure the following dependencies:
    • The spring-boot-starter-web dependency provides the dependencies of a web application.
    • The spring-data-mongodb dependency provides integration with the MongoDB document database.
  4. Enable the Java 8 Support of Spring Boot.
  5. Configure the main class of our application. This class is responsible for configuring and starting our application.

The relevant part of our pom.xml file looks as follows:

<properties>
<!-- Enable Java 8 -->
<java.version>1.8</java.version>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<!-- Configure the main class of our Spring Boot application -->
<start-class>com.javaadvent.bootrest.TodoAppConfig</start-class>
</properties>

<!-- Inherit defaults from Spring Boot -->
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>1.1.9.RELEASE</version>
</parent>

<dependencies>
<!-- Get the dependencies of a web application -->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>

<!-- Spring Data MongoDB-->
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
</dependency>
</dependencies>

<build>
<plugins>
<!-- Spring Boot Maven Support -->
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>



Let’s move on and find out how we can configure our application.

Configuring Our Application

We can configure our Spring Boot application by following these steps:

  1. Create a TodoAppConfig class to the com.javaadvent.bootrest package.
  2. Enable Spring Boot auto-configuration.
  3. Configure the Spring container to scan components found from the child packages of the com.javaadvent.bootrest package.
  4. Add the main() method to the TodoAppConfig class and implement it by running our application.

The source code of the TodoAppConfig class looks as follows:

package com.javaadvent.bootrest;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableAutoConfiguration
@ComponentScan
public class TodoAppConfig {

public static void main(String[] args) {
SpringApplication.run(TodoAppConfig.class, args);
}
}


We have now created the configuration class that configures and runs our Spring Boot application. Because the MongoDB jars are found from the classpath, Spring Boot configures the MongoDB connection by using its default settings.
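
The default settings mean that Spring Boot connects to a MongoDB instance running on localhost. If we need different connection settings, we can either set properties such as spring.data.mongodb.host and spring.data.mongodb.port in an application.properties file, or declare our own Mongo client bean. The following is only a rough sketch of the latter approach; it is not part of the original example and the host name is made up:

import java.net.UnknownHostException;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import com.mongodb.MongoClient;

@Configuration
class MongoConfig {

    // Hypothetical override: point the application at a non-default MongoDB host.
    // Spring Boot should back off from its auto-configured connection when we provide our own client bean.
    @Bean
    public MongoClient mongo() throws UnknownHostException {
        return new MongoClient("mongodb.example.com", 27017);
    }
}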

Let’s move on and implement our REST API.

Implementing Our REST API

We need to implement a REST API that provides CRUD operations for todo entries. The requirements of our REST API are:

  • A POST request sent to the URL ‘/api/todo’ must create a new todo entry by using the information found from the request body and return the information of the created todo entry.
  • A DELETE request sent to the URL ‘/api/todo/{id}’ must delete the todo entry whose id is found from the URL and return the information of the deleted todo entry.
  • A GET request sent to the URL ‘/api/todo’ must return all todo entries that are found from the database.
  • A GET request sent to the URL ‘/api/todo/{id}’ must return the information of the todo entry whose id is found from the URL.
  • A PUT request sent to the URL ‘/api/todo/{id}’ must update the information of an existing todo entry by using the information found from the request body and return the information of the updated todo entry.

We can fulfill these requirements by following these steps:

  1. Create the entity that contains the information of a single todo entry.
  2. Create the repository that is used to save todo entries to MongoDB database and find todo entries from it.
  3. Create the service layer that is responsible for mapping DTOs into domain objects and vice versa. The purpose of our service layer is to isolate our domain model from the web layer.
  4. Create the controller class that processes HTTP requests and returns the correct response back to the client.

Note: This example is so simple that we could just inject our repository into our controller. However, because this is not a viable strategy when we are implementing real-life applications, we will add a service layer between the web and repository layers.
Let’s get started.

Creating the Entity

We need to create the entity class that contains the information of a single todo entry. We can do this by following these steps:

  1. Add the id, description, and title fields to the created entity class. Configure the id field of the entity by annotating the id field with the @Id annotation.
  2. Add the constants (MAX_LENGTH_DESCRIPTION and MAX_LENGTH_TITLE) that specify the maximum length of the description and title fields.
  3. Add a static builder class to the entity class. This class is used to create new Todo objects.
  4. Add an update() method to the entity class. This method simply updates the title and description of the entity if valid values are given as method parameters.

The source code of the Todo class looks as follows:

import org.springframework.data.annotation.Id;

import static com.javaadvent.bootrest.util.PreCondition.isTrue;
import static com.javaadvent.bootrest.util.PreCondition.notEmpty;
import static com.javaadvent.bootrest.util.PreCondition.notNull;

final class Todo {

static final int MAX_LENGTH_DESCRIPTION = 500;
static final int MAX_LENGTH_TITLE = 100;

@Id
private String id;

private String description;

private String title;

public Todo() {}

private Todo(Builder builder) {
this.description = builder.description;
this.title = builder.title;
}

static Builder getBuilder() {
return new Builder();
}

//Other getters are omitted

public void update(String title, String description) {
checkTitleAndDescription(title, description);

this.title = title;
this.description = description;
}

/**
* We don't have to use the builder pattern here because the constructed
* class has only two String fields. However, I use the builder pattern
* in this example because it makes the code a bit easier to read.
*/
static class Builder {

private String description;

private String title;

private Builder() {}

Builder description(String description) {
this.description = description;
return this;
}

Builder title(String title) {
this.title = title;
return this;
}

Todo build() {
Todo build = new Todo(this);

build.checkTitleAndDescription(build.getTitle(), build.getDescription());

return build;
}
}

private void checkTitleAndDescription(String title, String description) {
notNull(title, "Title cannot be null");
notEmpty(title, "Title cannot be empty");
isTrue(title.length() <= MAX_LENGTH_TITLE,
"Title cannot be longer than %d characters",
MAX_LENGTH_TITLE
);

if (description != null) {
isTrue(description.length() <= MAX_LENGTH_DESCRIPTION,
"Description cannot be longer than %d characters",
MAX_LENGTH_DESCRIPTION
);
}
}
}
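
The PreCondition class whose static methods are used above is a small validation utility. Its source is not shown in the original post (it is included in the example application on GitHub), but a minimal sketch that matches the way it is used here could look like this (the actual implementation may differ):

final class PreCondition {

    private PreCondition() {}

    // Throws an IllegalArgumentException with a formatted message when the expression is false.
    static void isTrue(boolean expression, String errorMessageTemplate, Object... args) {
        if (!expression) {
            throw new IllegalArgumentException(String.format(errorMessageTemplate, args));
        }
    }

    // Throws an IllegalArgumentException when the given string is empty.
    static void notEmpty(String string, String errorMessage) {
        if (string.isEmpty()) {
            throw new IllegalArgumentException(errorMessage);
        }
    }

    // Throws a NullPointerException when the given reference is null.
    static void notNull(Object reference, String errorMessage) {
        if (reference == null) {
            throw new NullPointerException(errorMessage);
        }
    }
}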



Let’s move on and create the repository that communicates with the MongoDB database.

Creating the Repository

We have to create the repository interface that is used to save Todo objects to a MongoDB database and retrieve Todo objects from it.
If we don’t want to use the Java 8 support of Spring Data, we could create our repository by creating an interface that extends the CrudRepository<T, ID> interface. However, because we want to use the Java 8 support, we have to follow these steps:

  1. Create an interface that extends the Repository<T, ID> interface.
  2. Add the following repository methods to the created interface:
    1. The void delete(Todo deleted) method deletes the todo entry that is given as a method parameter.
    2. The List<Todo> findAll() method returns all todo entries that are found from the database.
    3. The Optional<Todo> findOne(String id) method returns the information of a single todo entry. If no todo entry is found, this method returns an empty Optional.
    4. The Todo save(Todo saved) method saves a new todo entry to the database and returns the saved todo entry.

The source code of the TodoRepository interface looks as follows:

import org.springframework.data.repository.Repository;

import java.util.List;
import java.util.Optional;

interface TodoRepository extends Repository<Todo, String> {

void delete(Todo deleted);

List<Todo> findAll();

Optional<Todo> findOne(String id);

Todo save(Todo saved);
}



Let’s move on and create the service layer of our example application.

Creating the Service Layer

First, we have to create a service interface that provides CRUD operations for todo entries. The source code of the TodoService interface looks as follows:

import java.util.List;

interface TodoService {

TodoDTO create(TodoDTO todo);

TodoDTO delete(String id);

List<TodoDTO> findAll();

TodoDTO findById(String id);

TodoDTO update(TodoDTO todo);
}


The TodoDTO class is a DTO that contains the information of a single todo entry. We will talk more about it when we create the web layer of our example application.
Second, we have to implement the TodoService interface. We can do this by following these steps:

  1. Inject our repository into the service class by using constructor injection.
  2. Add a private Todo findTodoById(String id) method to the service class and implement it by either returning the found Todo object or throwing the TodoNotFoundException.
  3. Add a private TodoDTO convertToDTO(Todo model) method to the service class and implement it by converting the Todo object into a TodoDTO object and returning the created object.
  4. Add a private List<TodoDTO> convertToDTOs(List<Todo> models) method and implement it by converting the list of Todo objects into a list of TodoDTO objects and returning the created list.
  5. Implement the TodoDTO create(TodoDTO todo) method. This method creates a new Todo object, saves the created object to the MongoDB database, and returns the information of the created todo entry.
  6. Implement the TodoDTO delete(String id) method. This method finds the deleted Todo object, deletes it, and returns the information of the deleted todo entry. If no Todo object is found with the given id, this method throws the TodoNotFoundException.
  7. Implement the List<TodoDTO> findAll() method. This method retrieves all Todo objects from the database, transforms them into a list of TodoDTO objects, and returns the created list.
  8. Implement the TodoDTO findById(String id) method. This method finds the Todo object from the database, converts it into a TodoDTO object, and returns the created TodoDTO object. If no todo entry is found, this method throws the TodoNotFoundException.
  9. Implement the TodoDTO update(TodoDTO todo) method. This method finds the updated Todo object from the database, updates its title and description, saves it, and returns the updated information. If the updated Todo object is not found, this method throws the TodoNotFoundException.

The source code of the MongoDBTodoService looks as follows:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

import java.util.List;
import java.util.Optional;

import static java.util.stream.Collectors.toList;

@Service
final class MongoDBTodoService implements TodoService {

private final TodoRepository repository;

@Autowired
MongoDBTodoService(TodoRepository repository) {
this.repository = repository;
}

@Override
public TodoDTO create(TodoDTO todo) {
Todo persisted = Todo.getBuilder()
.title(todo.getTitle())
.description(todo.getDescription())
.build();
persisted = repository.save(persisted);
return convertToDTO(persisted);
}

@Override
public TodoDTO delete(String id) {
Todo deleted = findTodoById(id);
repository.delete(deleted);
return convertToDTO(deleted);
}

@Override
public List<TodoDTO> findAll() {
List<Todo> todoEntries = repository.findAll();
return convertToDTOs(todoEntries);
}

private List<TodoDTO> convertToDTOs(List<Todo> models) {
return models.stream()
.map(this::convertToDTO)
.collect(toList());
}

@Override
public TodoDTO findById(String id) {
Todo found = findTodoById(id);
return convertToDTO(found);
}

@Override
public TodoDTO update(TodoDTO todo) {
Todo updated = findTodoById(todo.getId());
updated.update(todo.getTitle(), todo.getDescription());
updated = repository.save(updated);
return convertToDTO(updated);
}

private Todo findTodoById(String id) {
Optional<Todo> result = repository.findOne(id);
return result.orElseThrow(() -> new TodoNotFoundException(id));

}

private TodoDTO convertToDTO(Todo model) {
TodoDTO dto = new TodoDTO();

dto.setId(model.getId());
dto.setTitle(model.getTitle());
dto.setDescription(model.getDescription());

return dto;
}
}
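
The TodoNotFoundException thrown above is a simple runtime exception. Its source is not shown in the original post (it is included in the example application on GitHub), but a minimal sketch could be:

// A minimal sketch of the exception used by the service layer; the real class may differ.
final class TodoNotFoundException extends RuntimeException {

    TodoNotFoundException(String id) {
        super(String.format("No todo entry found with id: <%s>", id));
    }
}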


We have now created the service layer of our example application. Let’s move on and create the controller class.

Creating the Controller Class

First, we need to create the DTO class that contains the information of a single todo entry and specifies the validation rules that are used to ensure that only valid information can be saved to the database. The source code of the TodoDTO class looks as follows:

import org.hibernate.validator.constraints.NotEmpty;

import javax.validation.constraints.Size;

public final class TodoDTO {

private String id;

@Size(max = Todo.MAX_LENGTH_DESCRIPTION)
private String description;

@NotEmpty
@Size(max = Todo.MAX_LENGTH_TITLE)
private String title;

//Constructor, getters, and setters are omitted
}



Second, we have to create the controller class that processes the HTTP requests sent to our REST API and sends the correct response back to the client. We can do this by following these steps:

  1. Inject our service into our controller by using constructor injection.
  2. Add a create() method to our controller and implement it by following these steps:
    1. Read the information of the created todo entry from the request body.
    2. Validate the information of the created todo entry.
    3. Create a new todo entry and return the created todo entry. Set the response status to 201.
  3. Implement the delete() method by delegating the id of the deleted todo entry forward to our service and return the deleted todo entry.
  4. Implement the findAll() method by finding the todo entries from the database and returning the found todo entries.
  5. Implement the findById() method by finding the todo entry from the database and returning the found todo entry.
  6. Implement the update() method by following these steps:
    1. Read the information of the updated todo entry from the request body.
    2. Validate the information of the updated todo entry.
    3. Update the information of the todo entry and return the updated todo entry.
  7. Create an @ExceptionHandler method that sets the response status to 404 if the todo entry was not found (TodoNotFoundException was thrown).

The source code of the TodoController class looks as follows:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.HttpStatus;
import org.springframework.web.bind.annotation.ExceptionHandler;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.ResponseStatus;
import org.springframework.web.bind.annotation.RestController;

import javax.validation.Valid;
import java.util.List;

@RestController
@RequestMapping("/api/todo")
final class TodoController {

private final TodoService service;

@Autowired
TodoController(TodoService service) {
this.service = service;
}

@RequestMapping(method = RequestMethod.POST)
@ResponseStatus(HttpStatus.CREATED)
TodoDTO create(@RequestBody @Valid TodoDTO todoEntry) {
return service.create(todoEntry);
}

@RequestMapping(value = "{id}", method = RequestMethod.DELETE)
TodoDTO delete(@PathVariable("id") String id) {
return service.delete(id);
}

@RequestMapping(method = RequestMethod.GET)
List<TodoDTO> findAll() {
return service.findAll();
}

@RequestMapping(value = "{id}", method = RequestMethod.GET)
TodoDTO findById(@PathVariable("id") String id) {
return service.findById(id);
}

@RequestMapping(value = "{id}", method = RequestMethod.PUT)
TodoDTO update(@RequestBody @Valid TodoDTO todoEntry) {
return service.update(todoEntry);
}

@ExceptionHandler
@ResponseStatus(HttpStatus.NOT_FOUND)
public void handleTodoNotFound(TodoNotFoundException ex) {
}
}


Note: If the validation fails, our REST API returns the validation errors as JSON and sets the response status to 400. If you want to know more about this, read a blog post titled: Spring from the Trenches: Adding Validation to a REST API.
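
The handler that produces those JSON validation errors is described in the referenced post and is not part of the code above. As a rough sketch of how such a handler might look (the exact error format is up to you, and the names used here are illustrative):

import org.springframework.http.HttpStatus;
import org.springframework.validation.FieldError;
import org.springframework.web.bind.MethodArgumentNotValidException;
import org.springframework.web.bind.annotation.ControllerAdvice;
import org.springframework.web.bind.annotation.ExceptionHandler;
import org.springframework.web.bind.annotation.ResponseBody;
import org.springframework.web.bind.annotation.ResponseStatus;

import java.util.ArrayList;
import java.util.List;

@ControllerAdvice
final class RestErrorHandler {

    // Triggered when @Valid fails for a request body; returns the field errors as JSON with status 400.
    @ExceptionHandler(MethodArgumentNotValidException.class)
    @ResponseStatus(HttpStatus.BAD_REQUEST)
    @ResponseBody
    public List<String> handleValidationErrors(MethodArgumentNotValidException ex) {
        List<String> errors = new ArrayList<>();
        for (FieldError fieldError : ex.getBindingResult().getFieldErrors()) {
            errors.add(fieldError.getField() + ": " + fieldError.getDefaultMessage());
        }
        return errors;
    }
}
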
That is it. We have now created a REST API that provides CRUD operations for todo entries and saves them to a MongoDB database. Let’s summarize what we learned from this blog post.

Summary

This blog post has taught us three things:

  • We can get the required dependencies with Maven by declaring only two dependencies: spring-boot-starter-web and spring-data-mongodb.
  • If we are happy with the default configuration of Spring Boot, we can configure our web application by using its auto-configuration support and “dropping” new jars to the classpath.
  • We learned to create a simple REST API that saves information to a MongoDB database and finds information from it.

You can get the example application of this blog post from GitHub.
This post is part of the Java Advent Calendar and is licensed under the Creative Commons 3.0 Attribution license. If you like it, please spread the word by sharing, tweeting, FB, G+ and so on!

Recompiling the Java runtime library with debug symbols

The JDK already comes with the sources for the Java runtime library (also known as “rt.jar”), and the class files are compiled with “LineNumberTable”, which makes it possible to set breakpoints on a specified line. However, the class files are not compiled with “LocalVariableTable”, which means that even when you’re stopped at a breakpoint, you can only look at the method parameters, not at the local variables (hint: you can use javap to check whether a certain class file contains LineNumberTable, LocalVariableTable, both or neither). Probably this was done to save space, which is understandable in the case of the JRE, but for the JDK it’s annoying.

Fortunately you can fix this! Just use the ant script at the end of this post (or get it here) to build rt_debug.jar. After having built it, you have multiple options to make use of it:

  • Put it in your jre/lib/endorsed directory (create it if it doesn’t exist)
  • Specify it using -Xbootclasspath
  • Override jre/lib/rt.jar with it (after making a backup of course!)

I personally used the first method. The only downside is that classes appear twice in the Eclipse type explorer (once for rt.jar and once for rt_debug.jar). Perhaps in the future we will get the rt.jar for the JDK compiled with full debug symbols by default.

Final note: the ant script relies on having the JAVA_HOME variable available, so you might need to specify it by hand in case it isn’t set. For example on Ubuntu 14.10 the full command to be run is:

JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64 ant -f debug_jdk.xml

Ant script:


This post is part of the Java Advent Calendar and is licensed under the Creative Commons 3.0 Attribution license. If you like it, please spread the word by sharing, tweeting, FB, G+ and so on!

JDK 9 – a letter to Santa?!

As everybody knows, winter (especially the time before Christmas) is a proper time for dreaming and hoping, a moment when dreams seem to be touchable. A moment when children and grown-ups write on paper or in their thoughts fictive or real letters to Santa Claus, hoping their dreams will become reality. This is catchy, as even the people behind OpenJDK expressed their wishes for the next release of Java on the first day of December, when they published an updated list of JEPs. Hold on, don’t get excited just yet… as we bitterly know, they might become reality somewhere in early 2016. Or at least this is the plan, and history has shown us what sticking to a plan means :).
Of course, the presence of a JEP in the above mentioned list doesn’t mean that the final release will contain it, as the JEP process diagram clearly explains, but for the sake of winter fairy tales we will go through the list and provide a brief description of what the intended purpose of each item is.

Disclaimer: the list of JEPs is a moving target; since the publication of this article, the list has changed at least once.


Those of you who were not that good, it seems that Santa punished you: you had the “pleasure” of working with Java’s process API and of course met its limitations. After the changes in JDK 7, the current JEP comes to improve this API even further and to give us the ability to:
  • to get the pid (or equivalent) of the current Java virtual machine and the pid of processes created with the existing API
  • to get/set the process name of the current Java virtual machine and processes created with the existing API (where possible)
  • to enumerate Java virtual machines and processes on the system. Information on each process may include its pid, name, state, and perhaps resource usage
  • to deal with process trees, in particular some means to destroy a process tree
  • to deal with hundreds of sub-processes, perhaps multiplexing the output or error streams to avoid creating a thread per sub-process
I don’t know about you, but I can definitely find at least a couple of scenarios where I could put some of these features to good use, so fingers crossed.
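
For illustration only, here is a rough sketch that uses the process API as it eventually shipped in JDK 9; at the time this post was written the JEP was still a moving target, so treat the exact names as illustrative:

public class ProcessApiSketch {

    public static void main(String[] args) {
        // The pid of the current Java virtual machine.
        System.out.println("Current pid: " + ProcessHandle.current().pid());

        // Enumerate the processes visible on the system.
        ProcessHandle.allProcesses()
                .forEach(p -> System.out.println(p.pid() + " " + p.info().command().orElse("?")));
    }
}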


I had the luck and pleasure to attend a performance workshop the other day with Peter Lawrey, and one of the rules of thumb of Java performance tuning was: the less concurrent an application is, the more performant it is. With this JEP implemented, the latency of using monitors in Java is targeted, so the rules of performance tuning might need to find another rule of thumb. To be more accurate, the targets are:

  • Field reordering and cache line alignment
  • Speed up PlatformEvent::unpark()
  • Fast Java monitor enter operations
  • Fast Java monitor exit operations
  • Fast Java monitor notify/notifyAll operations
  • Adaptive spin improvements and SpinPause on SPARC


The title kind of says it all :). If you are working with enterprise applications, you have had to deal at least once or twice with a GC log, and I suppose you raised at least an eyebrow (if not both) when seeing the amount of information and the way it was presented there. Well, if you were “lucky” enough, you probably migrated between JVM versions, and then definitely wanted/needed another two eyebrows to raise when you realised that the parsers you had built for the previous version have issues dealing with the current version of the JVM logging. I suppose I could continue with why this is bad, but let’s concentrate on the improvements, so hopefully by the next release we will have a reason to complain that it was better before :P.

The GC logging seems to try to align with the other logging frameworks we might be used to, like log4j. So, it will work on different levels from the perspective of the logged information’s criticality (error, warning, info, debug, trace). The performance target is that the error and warning levels have no performance impact on production environments, info is suitable for production environments, while debug and trace have no performance requirements. A default log line will look as follows:
[gc][info][6.456s] Old collection complete

In order to ensure flexibility, the logging mechanisms will be tuneable through JVM parameters, the intention being to have a unified approach to them. For backwards compatibility purposes, the already existing JVM flags will be mapped to new flags, wherever possible.
To be as suitable as possible for realtime applications, the logging can be manipulated through the jcmd command or MBeans.
The sole, and probably biggest, downside of this JEP is that it targets only the logging mechanisms and doesn’t necessarily mean that the logs themselves will also improve. For the beautiful logs we dream of, we may need to wait a little bit longer.


As you probably know, the Java platform uses JIT compilers to ensure an optimum run of the written application. The two existing compilers, intuitively named C1 and C2, correspond to client (the -client option) and server side (the -server option) applications, respectively. The expressed goals of this JEP are to increase the manageability of these compilers:

  • Fine-grained and method-context dependent control of the JVM compilers (C1 and C2).
  • The ability to change the JVM compiler control options in run time.
  • No performance degradation.


It seems that JVM performance is targeted in the future Java release, as the current JEP is intended to optimise the code cache. The goals are:

    • Separate non-method, profiled, and non-profiled code
    • Shorter sweep times due to specialized iterators that skip non-method code
    • Improve execution time for some compilation-intensive benchmarks
    • Better control of JVM memory footprint
    • Decrease fragmentation of highly-optimized code
    • Improve code locality because code of the same type is likely to be accessed close in time
      • Better iTLB and iCache behavior
    • Establish a base for future extensions
      • Improved management of heterogeneous code; for example, Sumatra (GPU code) and AOT compiled code
      • Possibility of fine-grained locking per code heap
      • Future separation of code and metadata (see JDK-7072317)
    The first two declared goals are, from my perspective, quite exciting: with the two in place, the sweep times of the code cache can be highly improved by simply skipping the non-method areas – areas that should exist for the entire runtime of the JVM.


    The presence of this improvement shouldn’t be a surprise, but for me it is surprising that it didn’t make it into the JDK sooner, as JSON has replaced XML as the “lingua franca” of the web, not only for reactive JS front-ends but also for structuring the data in NoSQL databases. The declared goals of this JEP are:

    • Parsing and generation of JSON (RFC 7159).
    • Functionality meets needs of Java developers using JSON.
    • Parsing APIs which allow a choice of parsing token stream, event (includes document hierarchy context) stream, or immutable tree representation views of JSON documents and data streams.
    • Useful API subset for compact profiles and Java ME.
    • Immutable value tree construction using a Builder-style API.
    • Generator style API for JSON data stream output and for JSON “literals”.
    • A transformer API, which takes as input an existing value tree and produces a new value tree as result.

    Also, the intention is to align with JSR 353. Even if the future JSON API will have limited functionality compared to the already existing libraries, it has the competitive advantage of integrating with and using the newly added features from JDK 8, like streams and lambdas.



    sjavac is a wrapper around the already famous javac, intended to bring improved performance when compiling large projects. As the project currently has stability and portability issues, the main goal is to fix those issues and probably make it the default build tool for the JDK project. The stretch goal would be to make the tool ready to use for projects other than the JDK and probably integrate it with the existing toolchain.



    The first steps in the direction of Project Jigsaw’s implementation, with the intention of reorganising the source code into modules, enhancing the build tool to support module building, and enforcing the module boundaries.



    The goal of this JEP is to facilitate making large code bases clean of lint warnings. The deprecation warnings on imports cannot be suppressed using the @SuppressWarnings annotation, unlike uses of deprecated members in code. In large code bases like that of the JDK, deprecated functionality must often be supported for some time, and merely importing a deprecated construct does not justify a warning message if all the uses of the deprecated construct are intentional and suppressed.



    As the launch date for JDK 9 is early 2016, this JEP is perfect for that time of the year and the corresponding chores: the spring clean-up. The main goal of it is to have a clean compile under javac’s lint option (-Xlint:all) for at least the fundamental packages of the platform.



    Project Coin’s target, starting with JDK 7, was to bring some syntactic sugar to the Java language, to put some new clothes on the mature platform. Even if it didn’t bring any improvements to the performance of the language, it increased the readability of the code, and hence brought a plus to one of the most important assets of a software project, in my opinion: a more readable code base.
    This JEP targets four changes:

    1. Complete the removal, begun in Java SE 8, of underscore from the set of legal identifier names.


    Spring-time cleaning continues with the removal of the JVM flags deprecated in the Java 8 release, so with release 9 the following options will no longer be supported:


    • DefNew + CMS: -XX:-UseParNewGC -XX:+UseConcMarkSweepGC
    • ParNew + SerialOld: -XX:+UseParNewGC
    • ParNew + iCMS: -XX:+CMSIncrementalMode -XX:+UseConcMarkSweepGC
    • ParNew + iCMS: -Xincgc
    • DefNew + iCMS: -XX:+CMSIncrementalMode -XX:+UseConcMarkSweepGC -XX:-UseParNewGC
    • CMS foreground: -XX:+UseCMSCompactAtFullCollection
    • CMS foreground: -XX:+CMSFullGCsBeforeCompaction
    • CMS foreground: -XX:+UseCMSCollectionPassing



    This JEP aims to fix javac to properly accept and reject programs regardless of the order of import statements and extends and implements clauses.


    An increasing number of application layer protocols have been designed to use UDP transport, in particular protocols such as the Session Initiation Protocol (SIP) and electronic gaming protocols. This has made security concerns higher than ever, especially since TLS can be used only over reliable transports like TCP. The current JEP intends to fill this gap by defining an API for Datagram Transport Layer Security (DTLS) version 1.0 (RFC 4347) and 1.2 (RFC 6347).



    This JEP comes as a follow-up step to JEP 201, with the intention to restructure the JDK and run-time environment to accommodate modules and to improve performance, security, and maintainability. It defines a new URI scheme for naming the modules, classes, and resources stored in a run-time image without revealing the internal structure or format of the image, and revises existing specifications as required to accommodate these changes.


    As the HTML standard has reached version 5, the javadoc pages of the JDK need to keep up the pace as well, hence the upgrade from HTML 4.01.



    Remove the ability to request (by using -version:), at JRE launch time, a version of the JRE that is not the JRE being launched. The removal will be done stepwise: a warning will be emitted in version 9, while Java 10 will probably throw an error.

    This is the current form of the list of enhancements prepared for JDK 9. To be honest, when I first looked over it I was somewhat blue, but after reading more into it I became rather excited, as it seems that Java is about to start the road to another adventure and they need all the help they can get. So if you want to get involved (please do 😉), a later blog post of the Java Advent series will show you how to get involved. Imagine it like the fellowship of the ring, but the target of the adventure is building Java, not destroying the ring… who might Mr. Frodo be?

    This post is part of the Java Advent Calendar and is licensed under the Creative Commons 3.0 Attribution license. If you like it, please spread the word by sharing, tweeting, FB, G+ and so on!

    Java Advent Calendar 2014

    Hello everybody,
    I meet with excitement this time of the year and the 2014 edition of the Java Advent Calendar. Unfortunately, this year’s schedule will suffer delays and we apologize for this, but rest assured that the content and the posters will be at the high level everybody is already used to. So, the next post of the series will be posted somewhere by the end of this week, and from that point on we hope everything will go on as expected.
    I am quite excited by the posters’ line-up, some of them being veterans of this project already (they have posted in every previous edition of the calendar), while others are with us for the first time, but rest assured the topics are interesting and quite varied. I will not spoil the surprise, but let me just say this: Christmas will be met in a traditional fashion.
    I hope you will enjoy this year’s series of articles and once more I apologize for the delay.
    I wish you a peaceful and happy Advent time!

    2013 – The Java/JVM Year in Review and a Look to the Future!

    Hi all, It’s been an amazing year for Java/JVM developers!

    For me it’s been a year of fine tuning the Adopt OpenJDK and Adopt a JSR programmes with the LJC and working hard at jClarity – The Java/JVM performance analysis company that I’ve now been demoted to CEO of ;-).

    In terms of the overall ecosystem there’s been some great steps forward:

    1. The language and platform continues to evolve (e.g. Java 8 is looking in great shape!).
    2. The standards body continues to become more inclusive and open (e.g. More Java User Groups have joined the Executive Committee of the JCP).
    3. The community continues to thrive (record numbers of conferences, Java User Groups and GitHub commits).

    As is traditional, I’ll try and take you through some of the trends of the past year that was and a small peek into the future as well.

    2013 Trends

    There were some major trends in the Java/JVM and software development arena in 2013. Mobile Apps have become the norm, virtualised infrastructure is everywhere and of course Java is still growing faster (in absolute terms) than any other language out there except for Javascript. Here are some other trends worth noting.

    Functional Programming

    Mainstream Java developers have adopted Scala, Groovy, Clojure and other JVM languages to write safer, more concise code for callback handlers, filters, reduces and a host of other operations on collections and event handlers. A massive wave of developers will of course embrace Java 8 and lambdas when it comes out in early 2014.

    Java moving beyond traditional web apps

    In 2013 Java firmly entrenched itself in areas of software development outside of the traditional 3-tier web application. It is used in the realm of NoSQL (e.g. Hadoop), high-performance messaging systems (e.g. LMAX Disruptor), polyglot actor-like application frameworks (e.g. Vert.x) and much, much more.

    HTML 5

    2013 started to see the slow decline of Java/JVM server-side templating systems, as the rise of Javascript libraries and micro frameworks (e.g. AngularJS, Twitter Bootstrap) means that developers can finally have a client-side templating framework that doesn’t suck and has real data binding to JSON/XML messages going to and from the browser. Websockets and asynchronous messaging are also nicely taken care of by the likes of SockJS.

    2014+ – The Future

    Java 8 – Lambdas

    More than enough has been written about these, so I’ll offer up a practical tutorial and the pragmatic Java 8 Lambdas Book for 2014 that you should be buying!
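
    If you have not met lambdas yet, here is a tiny, self-contained taste (not from the original post) of what lambdas and the streams API look like:

    import java.util.Arrays;
    import java.util.List;

    public class LambdaTaste {

        public static void main(String[] args) {
            List<String> names = Arrays.asList("Java", "Advent", "Calendar");

            // Filter with a lambda, print with a method reference.
            names.stream()
                 .filter(name -> name.length() > 4)
                 .forEach(System.out::println);
        }
    }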

    Java 9/10 – VM improvements

    Java 9 and 10 are unlikely to bring many changes to the language itself, apart from applying Java 7 and 8 features to the internal APIs, making the overall Java API a much more pleasant experience to use.

    So what is coming? Well, the first likely major feature will be some sort of packed object or value type representation. This is where you will be able to define a Java ‘object’ as being purely a data type. What do I mean by this? Well, it means that the ‘object’ will be able to be laid out in memory very efficiently and not have the overhead of hashCode, equals or other regalia that comes with being a Java Object. Think of it a little bit like a C/C++ struct. Here’s an example of what it might look like in source code:


    @Packed
    final class PackedPoint extends PackedObject {
    int x, y;
    }

    And the native representation:


    // Native representation
    struct Point {
    int x, y;
    }

    There are loads of performance advantages to this, especially for immutable, well defined data structures such as points on a graph or within a 3D model. The JVM will be able to use far more efficient housekeeping mechanisms with respects to Memory Layout, Lookups, JIT and Garbage Collection.

    Some sort of reified generics will also come into the picture, but it’s more likely to be an extension of the work done on the ‘packed object’ idea, removing the unnecessary overhead in maintaining and manipulating collections that use the Object representation of primitives (e.g. List<Integer>).

    Java 9/10 – ME/SE convergence

    It was very clear at JavaOne this year that Oracle is going to try and make Java a player in the IoT space. The opportunities are endless, but the JVM has some way to go to find the right balance between being able to work on really small devices (which is where Java ME and CLDC sit today) and providing rich language features and a full stack for Java developers (which is where Java SE and EE sit today).

    The merger between ME and SE begins in earnest in Java 8 and will hopefully be completed by Java 9. I encourage all of you to try out early versions of this on your Raspberry Pi!

    But wait there’s more!

    The way Java itself is built and maintained is changing, with a move towards more open development at OpenJDK. Features such as tail call recursion and co-routines are being discussed, and there is a port on the way to have Java natively work on graphics cards.

    There’s plenty to challenge yourself with and also have a lot of fun in 2014! Me? I’m going to try and get Java running on my Pi and maybe build that AngularJS front end for controlling hardware in my house…..

    Happy Holidays everyone!

    Martijn (@karianna) Java Champion, Speaker, Author, Cat Herder etc

    Meta: this post is part of the Java Advent Calendar and is licensed under the Creative Commons 3.0 Attribution license. If you like it, please spread the word by sharing, tweeting, FB, G+ and so on!

    Under the JVM hood – Classloaders

    By Simon Maple (@sjmaple), ZeroTurnaround Technical Evangelist

    Classloaders are a low level and often ignored aspect of the Java language among many developers. At ZeroTurnaround, our developers have had to live, breathe, eat, drink and almost get intimate with classloaders to produce the JRebel technology which interacts at a classloader level to provide live runtime class reloading, avoiding lengthy rebuilds/repackaging/redeploying cycles. Here are some of the things we’ve learnt around classloaders including some debugging tips which will hopefully save you time and potential headdesking in the future.

    A classloader is just a plain java object

    Yes, it’s nothing clever; well, other than the system classloader in the JVM, a classloader is just a Java object! It’s an abstract class, ClassLoader, which can be implemented by a class you create. Here is the API (simplified):

    public abstract class ClassLoader {

        public Class<?> loadClass(String name);

        protected Class<?> defineClass(byte[] b);

        public URL getResource(String name);

        public Enumeration<URL> getResources(String name);

        public ClassLoader getParent();
    }

    Looks pretty straightforward, right? Let’s take a look method by method. The central method is loadClass, which just takes a String class name and returns you the actual Class object. This is the method which, if you’ve used classloaders before, is probably the most familiar, as it’s the most used in day to day coding. defineClass is a final method which takes a byte array (from a file or a location on the network) and produces the same outcome: a Class object.

    A classloader can also find resources from a classpath. It works in a similar way to the loadClass method. There are a couple of methods, getResource and getResources, which return a URL or an Enumeration of URLs which point to the resource which represents the name passed as input to the method.

    Every classloader has a parent; getParent returns the classloader’s parent, which is not related to Java inheritance, but rather a linked-list style connection. We will look into this in a little more depth later on.
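
    As a quick illustration (this snippet is not from the original article), you can walk and print the parent chain yourself:

    public class ClassLoaderChain {

        public static void main(String[] args) {
            // Walk the parent chain of the classloader that loaded this class;
            // a null parent marks the bootstrap classloader.
            ClassLoader cl = ClassLoaderChain.class.getClassLoader();
            while (cl != null) {
                System.out.println(cl);
                cl = cl.getParent();
            }
        }
    }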

    Classloaders are lazy, so classes are only ever loaded when they are requested at runtime. Classes are loaded by the classloader of the class which references them, so, at runtime, a class could be loaded by multiple classloaders depending on where it is referenced from and which classloader loaded the class which referen… oops, I’ve gone cross-eyed! Let’s look at some code.

    public class A {

        public void doSmth() {
            B b = new B();
            b.doSmthElse();
        }
    }

    Here we have class A calling the constructor of class B within its doSmth() method. Under the covers, this is what is happening:

    A.class.getClassLoader().loadClass("B");

    The classloader which originally loaded class A is invoked to load the class B.

    Classloaders are hierarchical, but like children, they don’t always ask their parents

    Every classloader has a parent classloader. When a classloader is asked for a class, it will typically go straight to the parent classloader first, calling loadClass, which may in turn ask its parent and so on. If two classloaders with the same parent are asked to load the same class, it would only be done once, by the parent. It gets very troublesome when two classloaders load the same class separately, as this can cause problems which we’ll look at later.

    When the JEE spec was designed, the web classloader was designed to work the opposite way – great. Let’s take a look at the figure below as our example.  
     


    Module WAR1 has its own classloader and prefers to load classes itself rather than delegate to its parent, the classloader scoped by App1.ear. This means different WAR modules, like WAR1 and WAR2, cannot see each other’s classes. The App1.ear module has its own classloader and is parent to the WAR1 and WAR2 classloaders. The App1.ear classloader is used by the WAR1 and WAR2 classloaders when they need to delegate a request up the hierarchy, i.e. a class is required outside of the WAR classloader scope. Effectively the WAR classes override the EAR classes where both exist. Finally, the EAR classloader’s parent is the container classloader. The EAR classloader will delegate requests to the container classloader, but it does not do it in the same way as the WAR classloader, as the EAR classloader will actually prefer to delegate up rather than prefer local classes. As you can see, this is getting quite hairy and is different to the plain JSE class loading behaviour.

    The flat classpath

    We talked about how the system classloader looks to the classpath to find classes that have been requested. This classpath could include directories or JAR files, and the order in which they are looked through is actually dependent on the JVM you are using. There may be multiple copies or versions of the class you require on the classpath, but you will always get the first instance of the class found on the classpath. It’s essentially just a list of resources, which is why it’s referred to as flat. As a result, the classpath list can often be relatively slow to iterate through when looking for a resource.

    Problems can occur when applications that are using the same classpath want to use different versions of a class; let’s use Hibernate as an example. When two versions of Hibernate JARs exist on the classpath, one version cannot be higher up the classpath for one application than it is for the other, which means both will have to use the same version. One way around this is to bloat the application (WAR) with all the libraries necessary, so that they use their local resources, but this then leads to big applications which are hard to maintain. Welcome to JAR hell! OSGi provides a solution here, as it allows versioning of JAR files, or bundles, which results in a mechanism to allow wiring to particular versions of JAR files, avoiding the flat classpath problems.

    How do I debug my class loading errors?

    NoClassDefFoundError/ClassNotFoundException/ClassNoDefFoundException?

     

    So, you’ve got an error/exception like the ones above. Well, does the class actually exist? Don’t bother looking in your IDE, as that’s where you compiled your class; it must be there, otherwise you’d get a compile time error. This is a runtime exception, so it’s in the runtime we want to look for the class which it says we’re missing… but where do you start? Consider the following piece of code…

    System.out.println(Arrays.toString(
            ((URLClassLoader) Test.class.getClassLoader()).getURLs()));

    This code prints all JARs and directories on the classpath of the classloader the class Test is using. So now we can see if the JAR or location our mystery class should exist in is actually on the classpath. If it does not exist, add it! If it does exist, check the JAR/directory to make sure your class actually exists in that location, and add it if it’s missing. These are the two typical problems which result in this error case.

    NoSuchMethodError/NoSuchFieldError/AbstractMethodError/IllegalAccessError?

     

    Now it’s getting interesting! These are all subclasses of IncompatibleClassChangeError. We know the classloader has found the class we want (by name), but clearly it hasn’t found the right version. Here we have a class called Test which is making an invocation to another class, Util, but BANG – we get an exception! Let’s look at the next snippet of code to debug:

    System.out.println(Test.class.getClassLoader().getResource(
            Util.class.getName().replace('.', '/') + ".class"));

    We’re calling getResource on the classloader of class Test. This returns us the URL of the Util resource. Notice we’ve replaced the ‘.’ with a ‘/’ and added ‘.class’ at the end of the String. This changes the package and class name of the class we’re looking for (from the perspective of the classloader) into a directory structure and filename on the filesystem – neat. This will show us the exact class we have loaded, and we can make sure it’s the correct version. We can use javap -private on the class at a command prompt to check which methods and fields actually exist. You can easily see the structure of the class and validate whether it’s you or the Java runtime which is going crazy! Believe me, at one stage or another you’ll question both, and nearly every time it will be you! :o)

    LinkageError/ClassCastException/IllegalAccessError

     

    These can occur if two different classloaders load the same class and they try to interact… ouch! Yes, it’s now getting a bit hairy. This can cause problems, as we do not know if they will load the classes from the same place. How can this happen? Let’s look at the following code, still in the Test class:

    Factory.instance().sayHello();

    The code looks pretty clean and safe, and it’s not clear how an error could emerge from this line. We’re calling a static factory method to get us an instance of the Util class and are invoking a method on it. Let’s look at this supporting image to show the reason why an exception is being thrown.


    Here we can see a web classloader (which loaded the Test class) will prefer local classes, so when it makes reference to a class, it will be loaded by the web classloader, if possible. Fairly straightforward so far. The Test class uses the Factory class to get hold of an instance of the Util class, which is fairly typical practice in Java, but the Factory class doesn’t exist in the WAR as it is an external library. This is no problem, as the web classloader can delegate to the shared classloader, which can see the Factory class. Note that the shared classloader is now loading its own version of the Util class, as when the Factory instantiates the class, it uses the shared classloader (as shown in the first example earlier). The Factory class returns the Util object (created by the shared classloader) back to the WAR, which then tries to use the class, and effectively cast the class to a potentially different version of the same class (the Util class visible to the web classloader). BOOM!

    We can run the same code as before from within both places (the Factory.instance() method and the Test class) to see where each of our Util classes is being loaded from.

    System.out.println(Test.class.getClassLoader().getResource(
            Util.class.getName().replace('.', '/') + ".class"));

    Hopefully this has given you an insight into the world of classloading, and instead of not understanding the classloader, you can now appreciate it with a hint of fear and uncertainty! Thanks for reading and making it to the end. We’d all like to wish you a Merry Christmas and a happy new year from ZeroTurnaround!  Happy coding!
     
    Meta: this post is part of the Java Advent Calendar and is licensed under the Creative Commons 3.0 Attribution license. If you like it, please spread the word by sharing, tweeting, FB, G+ and so on! Want to write for the blog? We are looking for contributors to fill all 24 slot and would love to have your contribution! Contact Attila Balazs to contribute!