EJB 3.0 Performance Measurements

By Raghu Kodali

The Enterprise JavaBeans (EJB) specification has evolved over the last few years. Before EJB 2.1, application developers faced a burden of overwhelming complexity: They had to manage several component interfaces, deployment descriptors, and unnecessary callback methods; work within the limitations of the EJB Query Language (EJBQL); and learn and implement the design patterns used to overcome the limitations of the specification.

Although EJB 2.1 did improve things, many still say the specification is too complex, and that criticism is often seen as a reflection of the problems of the entire J2EE platform.

The theme for the Java EE 5 platform is ease of development, and reducing the complexity of EJB development is at its core. The EJB 3.0 specification simplifies development by removing the requirements for component interfaces, deployment descriptors, and callback methods and by adopting regular Java classes and business interfaces as EJBs.

The specification also leverages metadata annotations, standardized by JSR 175, and the proven plain old Java object (POJO) persistence architecture used by object-relational (O/R) mapping frameworks such as Oracle TopLink and Hibernate. These two features have removed much of the specification's complexity. Now you can take a regular Java class, add annotations to it, and deploy it to an EJB 3.0 container as an entity.
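To illustrate (the class and field names here are our own invention, not taken from the JazzCat schema), an EJB 3.0 entity is nothing more than an annotated POJO:

```java
import java.io.Serializable;
import javax.persistence.Entity;
import javax.persistence.Id;

// A plain Java class becomes a persistent entity with two annotations:
// no home or remote interfaces, no deployment descriptor entry, and no
// required callback methods. (Illustrative sketch; the names are not
// from the article's JazzCat application.)
@Entity
public class Album implements Serializable {

    @Id
    private Long id;        // primary key column

    private String title;   // mapped to a column by default

    public Album() {}       // persistence requires a no-arg constructor

    public Long getId() { return id; }
    public String getTitle() { return title; }
    public void setTitle(String title) { this.title = title; }
}
```

Deploying this class to an EJB 3.0 container is enough to make it a managed entity; the equivalent EJB 2.1 entity bean would need abstract accessors, home and component interfaces, and descriptor entries.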

With all this simplicity and power in the EJB 3.0 specification, we couldn't help thinking that performance must be rather poor. After all, that simplicity must come at a price. With this in mind, we set out to test EJB 3.0's performance using Oracle's implementation of the specification.

We thought that the best way to test was to do a few things that developers typically do with EJB 2.1 and then try the equivalent with EJB 3.0. This would let us compare relative performance rather than deal with raw numbers that probably wouldn't mean much. We tested the Data Transfer Object (DTO) design pattern, the Session Façade design pattern, and the use of the Container Managed Relationships (CMR) functionality of entity beans.

The application we used, as well as the test harness and the methodology, is described in "J2EE Performance Testing" by Peter Zadrozny (Expert Press, 2002). The harness is a simple dispatcher servlet that executes each discrete test case based on the JazzCat application, a catalog of jazz recordings. The database schema includes tables for bands, musicians, instruments, tracks, and albums. The application also handles the storage and retrieval of the recording sessions and takes of a track.

The users were simulated with The Grinder 3.0 Beta 25, with a sample size of 5,000 milliseconds. Each test run lasted 8 minutes; we ignored the first 3 minutes to allow the test to stabilize. Each simulated user ran a test script that called the corresponding test case 10 times. The test scripts were executed continuously in a sequential fashion for the duration of the test run. Two things are worth noting: There was a separate HTTP session for each execution of the test scripts, and there was no sleep time between each call to the test case. The latter was done to create a highly stressful situation so we could see how EJB 3.0 behaved when pushed to its limits.

To get a complete picture of the performance, we used two key indicators: Aggregate Average Response Time (AART), to reflect the end user perspective, and Total Transactional Rate (TTR) or throughput, to reflect the load on the systems involved.
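As a rough sketch of how these two indicators are derived from raw samples (the formulas below are our reading of the metric definitions, not code from Zadrozny's harness):

```java
// Sketch of the two performance indicators used in this article.
// AART and TTR are computed here from hypothetical per-request samples;
// the harness's actual bookkeeping is not shown in the article.
public class Metrics {

    // Aggregate Average Response Time: the mean response time across
    // all sampled requests, in milliseconds (end-user perspective).
    static double aart(long[] responseTimesMs) {
        long sum = 0;
        for (long t : responseTimesMs) sum += t;
        return (double) sum / responseTimesMs.length;
    }

    // Total Transactional Rate (throughput): completed requests divided
    // by the elapsed wall-clock seconds of the measured window
    // (system-load perspective).
    static double ttr(int completedRequests, double elapsedSeconds) {
        return completedRequests / elapsedSeconds;
    }
}
```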

We compared three different test cases:

  1. With and without DTOs in an EJB 2.1 application against EJB 3.0
  2. A Session Façade in an EJB 2.1 application against a POJO-based session façade in EJB 3.0
  3. Container Managed Relationships (CMR) in EJB 2.1 against POJO-based getters

The results were as follows:

With and without DTOs in an EJB 2.1 application against EJB 3.0

We ran the tests with 15, 25, 50, 100, 150, 200, and 250 simultaneous users, and we found that for both response time and throughput the difference between EJB 2.1 using DTOs and EJB 3.0 was within 4%. This is within the margin of error, so for all practical purposes the performance is the same. To give you an idea of the actual numbers, with 250 simultaneous users the aggregate average response time was 1,100 milliseconds and the throughput averaged 225 requests per second. This is very impressive, especially since there was no sleep time in the test scripts.

The DTO Off test case produced an AART 28% higher than both DTO On and EJB 3.0, and a TTR that was 19% lower. EJB 3.0 also holds up a little longer under load, delivering 12% better throughput than DTO On with 250 users and a 13% better response time at the same load. So you can say that under heavy loads, EJB 3.0 tends to perform better than EJB 2.1, at least in this test case.
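For reference, the pattern being exercised can be sketched in plain Java; the class and field names below are invented for illustration and are not from the JazzCat code:

```java
import java.io.Serializable;

// Stand-in for the entity's state. In EJB 2.1 this would be a
// container-managed entity bean; in EJB 3.0 the entity is itself a POJO
// that can be detached and returned to the caller, making the DTO optional.
class Musician {
    private final Long id;
    private final String name;
    private final String instrument;
    Musician(Long id, String name, String instrument) {
        this.id = id; this.name = name; this.instrument = instrument;
    }
    Long getId() { return id; }
    String getName() { return name; }
    String getInstrument() { return instrument; }
}

// The Data Transfer Object: a serializable snapshot handed to the web
// tier so it never touches the entity bean directly.
public class MusicianDTO implements Serializable {
    private final Long id;
    private final String name;
    private final String instrument;

    private MusicianDTO(Long id, String name, String instrument) {
        this.id = id; this.name = name; this.instrument = instrument;
    }

    // Field-by-field copy from the entity. This per-request copying is
    // the extra work the DTO On test case pays for.
    public static MusicianDTO from(Musician entity) {
        return new MusicianDTO(entity.getId(), entity.getName(),
                               entity.getInstrument());
    }

    public Long getId() { return id; }
    public String getName() { return name; }
    public String getInstrument() { return instrument; }
}
```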

A Session Façade in an EJB 2.1 application against a POJO-based session façade in EJB 3.0

We conducted these tests with 15, 25, 50, 75, and 100 simultaneous users. The test script performed 10 different searches, one for each letter from a through j. To our surprise, we found that EJB 3.0 delivered roughly double the performance of EJB 2.1. Looking at the response time in Figure 2, we can see that as the load increases, the difference in response time grows dramatically, from 18% for 15 users all the way up to 58% for 100 users, for an overall average of 46%.
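A minimal sketch of what the EJB 3.0 side of this test case looks like, assuming a hypothetical AlbumCatalog business interface (the real JazzCat façade's names are not shown in the article):

```java
import java.util.List;
import javax.ejb.Stateless;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

// The business interface is an ordinary Java interface: no EJBObject,
// no home interface. (Hypothetical names, for illustration only.)
interface AlbumCatalog {
    List<String> findAlbumTitlesStartingWith(String prefix);
}

// The facade itself is a POJO with a single annotation; the container
// injects the persistence context and manages transactions, replacing
// EJB 2.1's ejbCreate/ejbRemove callbacks and descriptor entries.
@Stateless
public class AlbumCatalogBean implements AlbumCatalog {

    @PersistenceContext
    private EntityManager em;

    @SuppressWarnings("unchecked")
    public List<String> findAlbumTitlesStartingWith(String prefix) {
        return em.createQuery(
                   "SELECT a.title FROM Album a WHERE a.title LIKE :p")
                 .setParameter("p", prefix + "%")
                 .getResultList();
    }
}
```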

Container Managed Relationships (CMR) in EJB 2.1 against POJO-based getters

The performance of EJB 3.0 was again roughly double that of EJB 2.1. The response-time results look very similar to those of the Session Façade test case: the difference starts at 20% at the lightest user load and jumps to about 55% for the rest of the user loads.
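The difference in programming model can be sketched as follows. In EJB 2.1, these accessors would be abstract CMR methods implemented by the container; in EJB 3.0 they are plain fields and getters on annotated POJOs (the entity and field names below are assumptions, not the actual JazzCat schema):

```java
import java.io.Serializable;
import java.util.ArrayList;
import java.util.List;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.ManyToOne;
import javax.persistence.OneToMany;

@Entity
class Musician implements Serializable {
    @Id private Long id;

    // Owning side of the relationship, mapped to a foreign key column.
    @ManyToOne private Band band;
}

@Entity
public class Band implements Serializable {
    @Id private Long id;

    // Replaces EJB 2.1's abstract CMR accessors: the relationship is an
    // ordinary collection field, and traversal is a plain getter.
    @OneToMany(mappedBy = "band")
    private List<Musician> musicians = new ArrayList<Musician>();

    public List<Musician> getMusicians() { return musicians; }
}
```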

Conclusions

Considering that we used only a developer's preview of EJB 3.0, we're very impressed with this specification, or at least with Oracle's implementation of it. The wonderful work done with annotations and persistence, the ability to use POJOs, and the ability to test outside the container are very attractive all by themselves. But now, with an implementation that equals or doubles the performance of EJB 2.1 (at least for our test cases), we think the EJB 3.0 specification is right on track.