Calculating E2E no-JVM test code coverage with JCov

What on earth is dynamic instrumentation code coverage and why might you want it?

Java code coverage tools, like those embedded in IDEs or provided as CI environment plugins, are great, but they have one limitation – the tests you run also have to be written in Java or another JVM language. What if you have suites of tests in other, non-JVM languages and would like to know what is covered and what is not?

I faced exactly such an issue – we had a really big suite of e2e REST API tests written in Python, executed against a big Java application running on Tomcat. We wanted to track where these tests go in the code. But how to check it?
A possible solution is so-called dynamic instrumentation of Java code using Java agents. In short, a Java agent is triggered at classloading time and transforms the class bytecode so that statistics about the executed lines are recorded. Magic applied in practice.
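To give a rough idea of the mechanism, here is a bare-bones sketch of the java.lang.instrument API that such agents build on (this is illustrative only, not JCov's actual code; a real agent jar also needs a Premain-Class entry in its manifest):

import java.lang.instrument.ClassFileTransformer;
import java.lang.instrument.Instrumentation;
import java.security.ProtectionDomain;

// Illustration only: premain() runs before the application's main(),
// and the registered transformer sees every class as it is loaded.
public class CoverageAgentSketch {
    public static void premain(String agentArgs, Instrumentation inst) {
        inst.addTransformer(new ClassFileTransformer() {
            @Override
            public byte[] transform(ClassLoader loader, String className, Class<?> classBeingRedefined,
                                    ProtectionDomain protectionDomain, byte[] classfileBuffer) {
                // A real coverage agent (like JCov) rewrites the bytecode here, e.g. with ASM,
                // so that executed lines are recorded at runtime.
                System.out.println("Loading: " + className);
                return null; // null means 'leave the class unchanged'
            }
        });
    }
}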

According to my research there are two reasonable solutions providing (among other features) dynamic instrumentation:

• Jacoco – has an agent mode; recognizable by quite a few people (4090 Stack Overflow matches as of February 2019), not necessarily for its capabilities for outside-Java-triggered execution, but for its Maven plugins integrated with Jenkins plugins, Sonar plugins, etc. I used it in some of my previous projects to observe code coverage from release to release, so it was my first shot. Disappointingly, for this use case it didn't work well. More on that in the 'Why not Jacoco' section.
• JCov – has an agent mode; developed as a tool targeted at Java Java developers (not a typo, I mean people actually developing Java, yep). Probably originating in the depths of Oracle's basements. Known by a small group of unicorns (9 Stack Overflow posts in three years and one reasonable presentation). You have to check it out and build it yourself. I went there, I'm alive, and I'm back with a practical tutorial.

I used it for generating code coverage reports while running e2e Python tests, but of course with such a flexible agent you can track coverage of code exercised in any way you like. A Postman suite? No problem. Clicking around the GUI wondering how the heck it works inside, without staring at the debugger? Here you go.

Dynamic vs static instrumentation

As I mentioned before, the instrumentation (the process of altering the bytecode of the examined classes) may be dynamic, and this is the flavour I used, but it can also be static. With static instrumentation you mutate the class files before you run them to gather coverage statistics. In comparison to dynamic instrumentation it makes execution faster (as there is no overhead at classloading) and it consumes much less memory in general. JCov also supports static instrumentation; I haven't tried it, though, as the dynamic mode was more suitable for me. You can find more information on the static instrumentation mode in the linked JavaOne presentation from the JCov developers themselves, and also in the indispensable verbose help built into jcov.jar.

Setup and operating instructions for JCov in dynamic mode for a Tomcat application

Building the tool itself

It is not that easy to start with JCov. You actually have to build it yourself using manually downloaded dependencies, as the site with released versions doesn't really work. I have built the final jcov.jar for you, and I also copied the intermediate dependencies to my repository to save you the hassle of looking them up over the net. But I still recommend downloading the sources yourself: with documentation being scarce, going to the sources can show you some hidden functionality. At least that was the case for me.

If you want to prepare it yourself, proceed as follows:

These are the official building instructions, although they seem a bit 'vintage' – JCov supports modern Java versions, yet the readme still refers to JDK 5. I built it on JDK 8.

1. Clone the jcov mercurial repo: hg clone http://hg.openjdk.java.net/code-tools/jcov
2. You will need the following JAR dependencies, built or downloaded manually, to build JCov (I list the exact versions I used; others may work too):

  • asm-7.0.jar
  • asm-tree-7.0.jar
  • asm-util-7.0.jar
  • jtharness-4_5-dev-bin-b27-19_apr_2013/lib/javatest.jar

  I found them in various places on the web. You can also download them from my github repository.

3. Edit the /build/build.properties file, setting the paths to the aforementioned jars:

        # path to asm libraries
        asm.jar = /asm-7.0.jar
        asm.tree.jar = /asm-tree-7.0.jar
        asm.util.jar = /asm-util-7.0.jar

        # path to javatest library (empty value allowed if you do not need jtobserver.jar)
        javatestjar = /javatest.jar

4. Go to the /build directory and execute the 'ant' command. If you got all the paths right, this should finally build the jcov.jar file in the /JCOV_BUILD/jcov_/ directory.
Using the JCov tool

jcov.jar embeds all the functions needed to generate a coverage report, from instrumenting the classes to compiling the final HTML (or other format) report itself.

The JCov tool has many sub-tools and usage scenarios – what follows is a concrete case where dynamic instrumentation is used to generate and dump coverage data on the fly from an application server that was not pre-instrumented before running.

The phases needed to generate such a report are as follows:

1. Attach jcov.jar to the server startup configuration in Java agent mode, exposing the command server
2. Start the application server
3. Command the agent to dump its data to a file (to exclude the server startup coverage from the statistics)
4. Run the cases whose coverage interests you, by any means, on the application server (for instance Python nosetests, a Postman suite, even manual clicking around the GUI or a browser automation plugin)
5. Dump the coverage data to a file again
6. Generate an HTML or other report from the coverage data

Now let's go through these phases in detail:

1. Attach jcov.jar to the server startup configuration in Java agent mode, exposing the command server

In the case of a Tomcat server the configuration went as follows:

        -javaagent://jcov.jar=file=important-coverage-data.xml,merge=gensuff,log,log.level=WARNING,include=com\.mycompany*,agent.port=3336

Let's explain the parameters:

• file=important-coverage-data.xml,merge=gensuff – the root of the filename the coverage data will be dumped to. The 'gensuff' merge option makes JCov create a separate coverage data file, with a random suffix, every time a dump is requested. There are also other modes of operation, for instance overwriting the existing data in the XML file on disk, or merging into its content. In my case the separate files with suffixes seemed the most practical mode.
• log, log.level – you know what these are.
• include=com\.mycompany* – makes the Java agent instrument only the classes in the selected packages. By default all classes are instrumented, but beware – that may cause excessive memory consumption, even OOM errors for bigger systems. Most probably you're not interested in coverage data for external libraries anyway. Note that you can use multiple include and exclude statements for a more sophisticated configuration (refer to the help built into jcov.jar for more information).
• agent.port=3336 – with this setting a simple command-accepting server is started inside the agent, waiting for a signal to dump the coverage data to file. Other options are dumping the data to file on VM exit, or using the so-called 'Grabber' module (read further).
2. Start the application server

Start the server however you do it in normal operation – just make sure it picks up the configuration from point 1. With this particular configuration you can tell it works if something is listening on localhost port 3336.

3. Command the agent to dump data to a file

With the agent.port configuration, commanding the agent to dump the current coverage statistics to an XML file turns out to be really crude.

Connect to the server using: telnet localhost 3336

Every time you send the 'save' string to the server, it will save the next portion of coverage data to an XML file and respond with 'saved' to your telnet session.

Of course you can automate this if you need to dump data periodically – even in bash, for instance:

        (sleep 1; echo "open localhost 3336"; sleep 1; while :; do echo "save"; sleep 60; done ) | telnet

Or use any other telnet client integrated with your testing environment.

This will dump a new portion of data every minute.

Remember that you can merge the results later with the merge command from jcov.jar.
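If telnet is awkward to integrate with your tooling, the same trigger can be sent from plain Java as well. A minimal sketch, assuming the 'save'/'saved' text protocol and the agent.port=3336 configuration described above:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.Socket;

// Minimal sketch: periodically ask the JCov agent (agent.port=3336) to dump its coverage data.
// Host, port and interval are assumptions – adjust them to your setup.
public class JcovDumpTrigger {
    public static void main(String[] args) throws Exception {
        try (Socket socket = new Socket("localhost", 3336);
             PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
             BufferedReader in = new BufferedReader(new InputStreamReader(socket.getInputStream()))) {
            while (true) {
                out.println("save");               // request a dump
                System.out.println(in.readLine()); // the agent should answer 'saved'
                Thread.sleep(60_000);              // once a minute, like the bash loop above
            }
        }
    }
}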

4. Run the cases whose coverage interests you, by any means, on the application server (for instance Python nosetests, a Postman suite, even manual clicking around the GUI or a browser automation plugin)
5. Dump the coverage data again to a file

As in step 3. My practice was to dump the data just before running the suite and just after, and then to use only the second file to calculate the coverage report.

6. Generate an HTML or other report from the coverage data

To generate the report in navigable HTML format, with the covered lines marked in the class sources, execute a command like:

        java -jar jcov.jar RepGen -sourcepath /path/to/module/one/src:/path/to/module/two/src /path/to/coverage-data.xml

This will, by default, generate a fully navigable HTML report in the /report directory below the location of jcov.jar.

Troubles I've encountered and you may too

Apart from a lot of time spent on trivial things which you won't have to repeat now that I've figured them out – like having to build the tool with good old Ant, or working out how to signal the agent to dump its contents to file – there are some problems you may still stumble upon:

• Out Of Memory errors / heap size

  Don't be surprised if your application throws OOMs at you or starts to act strangely – this may be the GC dying in there. Instrumenting the classes and storing execution statistics may use quite a lot of memory. I had to increase my heap size a lot to start up a big set of Tomcat applications, even after including only my company's classes in the filter. It also happened that when I did not apply ANY filter (just took all the classes) the JVM sometimes crashed instantly at startup. With filters everything was all right.
• Multiple source trees

  It was covered in the tutorial above, but I would like to mention it again: the help for jcov.jar mentions just a source location – singular. In fact you can pass multiple directories separated with your operating system's path separator (you can easily verify this in the JCov sources, starting from the parameter name; see also the short sketch after this list).

• Spring/CGLib bytecode altering

  Sometimes Spring resorts to generating class proxies which apparently don't just nicely delegate calls to the original implementations but dynamically subclass them. Effectively this means that the original class is lost in favour of the newly generated one. You will see its statistics in the report, but without a link to the original .java source. Another reason to reconsider using Java class inheritance nowadays!

• Grabber module

  In this article I show how to use the server embedded in the agent, but in theory, according to the JavaOne presentation, the standard way is to use the so-called Grabber server, to which the Java agent connects. I successfully set up the Grabber server, but the client didn't seem to work, so I resorted to the server embedded in the agent.

• Untouched classes

  In this setup only loaded classes are instrumented, so remember that if a class is not touched by the tests at all, you won't see it in the report. If you need that, you have to generate a so-called 'template' beforehand.
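As a side note to the 'multiple source trees' point above: the separator in question is simply the platform path separator, so if you build the RepGen invocation programmatically you can rely on File.pathSeparator. A small sketch (paths are placeholders):

import java.io.File;

// ':' on Linux/macOS, ';' on Windows – the separator accepted by RepGen's -sourcepath argument
public class SourcepathExample {
    public static void main(String[] args) {
        String sourcepath = String.join(File.pathSeparator,
                "/path/to/module/one/src",
                "/path/to/module/two/src");
        System.out.println("java -jar jcov.jar RepGen -sourcepath " + sourcepath + " /path/to/coverage-data.xml");
    }
}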

OK, what next? -hv is your friend

JCov is quite a mysterious tool in terms of community and documentation, but it is really worth noting that it has quite reasonable help built into jcov.jar itself.
It has many features which go beyond this introduction – for instance merging existing coverage data files, filtering results, even generating diffs from build to build.

Start by typing:
    java -jar jcov.jar

You will get a description of the modules of jcov.jar. You can get verbose info about every one of them with the -hv option, which stands for 'help verbose'. For instance:

java -jar jcov.jar Agent -hv

It is also worth exploring the source code itself, especially near the places where command line parameters are processed.

Also take a detailed look at the JavaOne presentation I mentioned.


Why not Jacoco?

At first Jacoco seemed the better choice, with much higher adoption, a bigger community and IDE and CI integrations, but at least in my case it just basically didn't work. After importing Jacoco coverage data into IntelliJ it was clear that lines which had definitely been executed were marked as not executed in the report, which defeats the purpose. Maybe it was a faulty plugin. I tried to generate the report outside the IDE, using the Jacoco tools themselves, but the report generation crashes when there are two classes with the same name in different packages, which for a codebase as big as mine made the tool useless. I have to stress, though, that I previously had good experiences with this product using its Maven/Jenkins plugins for tests executed in the JVM, so it may be worth giving it a shot.

Sources

My repository with JCov.jar and prerequisites built
JavaOne JCov talk slides
https://wiki.openjdk.java.net/display/CodeTools/How+To+Build+JCov

Geecon Prague 2015 – looking for a silver bullet again

Geecon has apparently gained momentum, aspiring to be the most recognizable conference brand in Central Europe. I recently attended the fourth installment this year – after TDD Geecon in Poznań, the main Geecon in Krakow and Microservices Geecon in Sopot, the time has come for Geecon Prague 2015. Let's go through a bunch of insights from this event.

The hype around microservices is still on the rise – a really big part of the talks was devoted to this topic. Another significant part of the lectures was about the importance of testing at various levels of abstraction, and about the system architecture needed to make reliable tests achievable.

Since history repeats itself (I really recommend reading THIS article, even though it is sooo long..), there were some talks indicating that, again, there is no silver bullet – which is what makes programmers sad, since […] is the most fun thing to do, everyone knows it. This silver bullet theme kept emerging in various talks, ranging from quite obvious but not so often asked questions – modular monolith? – to whether or not your microservices architecture is just a distributed ball of mud? (slides from the presentation 'Modular Monoliths' by Simon Brown).

Christopher Batey – The Dangers of Building Microservices

A microservice is simple. You can recompile and restart it in seconds. If you stop liking the code of your microservice you can rewrite it from scratch. They are small, so they can't be a big ball of mud, can they? The decomposition into a lot of independent services allows you to test them easily. Ah, the world of greenfield programming. No silver bullet. The microservice itself won't be a big ball of mud – by itself. BUT the vast network of connections between them can be. What a lot of people forgot at first was that word: network. Microservices are easy to test in isolation. But once you send packets over the wire, bad things can happen, and they will happen sooner or later, you can be sure. That is why you need to use tools and write tests which can handle and emulate situations like:

  • services responding slowly,
  • services not responding at all,
  • services responding fast but the network being slow for a moment,
  • corrupt messages…

Saboteur, Wiremock, Hystrix.. you should try them!

Antonio Molina & Nick Zeeb – What I Learned Writing a Trillion $

An interesting story built around lessons learned during the development of a system dealing with massive amounts of money going through Forex – namely $1,900,000,000,000 in one year in their case. Is it even possible to work with such a (continuously deployed!) system without an 80% heart attack rate per year among developers? It is. It wouldn't be without the lessons they have learned along the way. Some of the lessons I noted are:

1. Test like crazy. They have 10,000 acceptance tests. And many times more unit tests. Micro performance tests. Meso performance tests. Macro performance tests. Failure tests. Data migration tests. Live tests on the real market with small amounts of money. Of course it comes at a cost: 80% of development time. But they discovered that it pays off in the end.

2. Sometimes less is more. LMAX discovered that in their case spreading the code across multiple repositories brought more pain than gain. The fancy 'branch per feature' approach was also ditched. Oh, and there is one slightly bigger company which does a similar thing now: Google and its single repository. There is also some team culture involved here: always try to develop as small a working piece of code as you can. And the whole team tries to focus on the smallest achievable piece of functionality at a time.

3. Accept that you are bad at estimating the performance of your algorithms in a world of deep stack traces. You know big 'O' notation? Great – you can prove you were right in the performance tests you will write. But some day those tests will prove you were so, so wrong, and they will save you from a big 'O' on your face on release day.

No silver bullets, but I'm going to attend Geecon next year to try to find one, for sure! 😉

Cannot resolve symbol XYZ in IntelliJ although maven builds sources properly

Today I stumbled upon a weird problem with IntelliJ IDE.

I had to update the version of one of the Maven dependencies, which introduced a new class I wanted to use.
It was possible to build the project without any problem with mvn install, but the editor still marked the class red with the message 'cannot resolve symbol'.
Consequently it was, for instance, impossible to run single unit tests from the IDE, because IntelliJ considered the code invalid to compile.

I tried invalidating caches, reloading pom.xml, etc., but to no avail.

Finally it turned out that for some reason IntelliJ is incapable of tracking changes in Maven artifact versions if the version is given as a property, like:

<dependency>
    <groupId>com.jetdrone</groupId>
    <artifactId>yoke</artifactId>
    <version>${yoke.version}</version>
</dependency>
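For context, such a version property is typically defined in the POM's <properties> section – something along these lines (the property name comes from the snippet above; the version number is just a placeholder):

<properties>
    <!-- placeholder value; the dependency snippet above resolves ${yoke.version} from here -->
    <yoke.version>1.0.0</yoke.version>
</properties>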

The workaround goes as follows:

1. Replace ${yoke.version} with the raw version string
2. Go to File -> Settings… -> (search) Maven -> Repositories and click 'Update' on your local Maven repository.
3. Invalidate caches (File -> Invalidate Caches / Restart…).
4. Now it should work.
5. You can replace the tag content with the property reference again.

I wasted quite a lot of time on that, so maybe it will help someone with a similar issue 😉

Exposing Activiti BPMN Engine events via websockets by extending its REST application

The latest release of Activiti (5.15) introduced an interesting new mechanism – support for event handlers. You can now get notified the moment something interesting happens inside the process engine. (You can read more on events in Activiti HERE.)

I instantly thought it would be even more interesting if external applications integrated with the Activiti REST API could also get easily notified about process events. Using websockets as the medium for broadcasting these messages seemed quite natural.
In this article I will present, step by step, how I extended the default REST application to serve events via websockets. I also developed a client test consuming these events to prove the thing works; it is described later on as well.

Prerequisites

You can git clone or download a zip package with the sources of this project here:

https://github.com/wrobelm/activiti-websockets

To run it, just go to the activiti-websockets-server directory and execute

mvn jetty:run

This will build the server application and run it in the Maven embedded Jetty server.

Then, to run the automated client test, go to the activiti-websockets-client directory and execute

mvn test

You will need Maven running on JDK 8 to run the test project.

The activiti-websockets-server project

A broad view of the main concepts:

  • The project is based on the original Activiti REST application. It is included as a Maven dependency and extended where needed.
  • The websocket server has been developed using the standard Java EE 7 web API libraries. They are compatible with modern webapp containers like Jetty 9 and Tomcat 8.

Let's go into some more detail:

1. How is the websocket server hooked into the Activiti REST webapp?

The standard web.xml file has been altered to use a custom ServletContextListener:

...
 <listener>
 <listener-class>pl.mwrobel.activiti.extensions.CustomActivitiServletContextListener</listener-class>
 </listener>
...

This listener extends the default ActivitiServletContextListener.

Look at its source code now:

public class CustomActivitiServletContextListener extends ActivitiServletContextListener {

    private static final Logger log = LoggerFactory.getLogger(CustomActivitiServletContextListener.class);

    @Override
    public void contextInitialized(ServletContextEvent event) {
        super.contextInitialized(event);
        event.getServletContext().setAttribute("activiti-engine", ProcessEngines.getDefaultProcessEngine());
        addProcessEventsEndpoint(event);
    }

    public void addProcessEventsEndpoint(ServletContextEvent ce) {
        log.info("Deploying process-engine-events websockets server endpoint");
        ServletContext sc = ce.getServletContext();
        final ServerContainer serverContainer =
                (ServerContainer) ce.getServletContext().getAttribute("javax.websocket.server.ServerContainer");
        try {
            ServerEndpointConfig config = ServerEndpointConfig.Builder
                    .create(ProcessEngineEventsWebsocket.class, "/process-engine-events")
                    .build();
            config.getUserProperties().put("servlet.context", sc);
            serverContainer.addEndpoint(config);
        } catch (DeploymentException e) {
            throw new RuntimeException(e);
        }
    }
}

The two most important lines:

ServerEndpointConfig config = ServerEndpointConfig.Builder
        .create(ProcessEngineEventsWebsocket.class, "/process-engine-events")
        .build();

The new endpoint will be deployed at the path "/process-engine-events"; it is implemented by the websocket server class ProcessEngineEventsWebsocket.

event.getServletContext().setAttribute("activiti-engine", ProcessEngines.getDefaultProcessEngine());

The process engine reference is stored in the 'activiti-engine' servlet context attribute, which allows it to be acquired later in the ProcessEngineEventsWebsocket class.

2. Websocket server endpoint

public class ProcessEngineEventsWebsocket extends Endpoint {

    private static final Logger log = LoggerFactory.getLogger(ProcessEngineEventsWebsocket.class);

    private ServletContext servletContext;
    private ProcessEngine processEngine;

    @Override
    public void onOpen(final Session session, EndpointConfig config) {
        log.info("Websockets connection opened");
        this.servletContext = (ServletContext) config.getUserProperties().get("servlet.context");
        processEngine = (ProcessEngine) servletContext.getAttribute("activiti-engine");
        processEngine.getRuntimeService().addEventListener(new ActivitiProcessEventsWebsocketBroadcaster(session));
    }
}
processEngine = (ProcessEngine) servletContext.getAttribute("activiti-engine");

When a new websocket session is opened, the processEngine is retrieved from the servlet context attribute stored by the servlet context listener.

processEngine.getRuntimeService().addEventListener(new ActivitiProcessEventsWebsocketBroadcaster(session));

This registers ActivitiProcessEventsWebsocketBroadcaster as an event listener for all events published by the Activiti engine. The websocket session object is passed to the event handler so that it can send messages to the client.

3. ActivitiProcessEventsWebsocketBroadcaster – Activiti events handler

This class implements the ActivitiEventListener interface. Every time a new event occurs, this method gets called:

@Override
public void onEvent(ActivitiEvent event) {
    switch (event.getType()) {
        case ACTIVITY_STARTED: {
            broadcastEvent((ActivitiActivityEvent) event);
            break;
        }
        case ACTIVITY_COMPLETED: {
            broadcastEvent((ActivitiActivityEvent) event);
            break;
        }
    }
}

I decided to publish two selected types of Activiti events. Every time they occur, they are broadcast to the client. Quite self-explanatory.

private void broadcastEvent(ActivitiActivityEvent e) {
    ProcessEventDTO dto = ProcessEventDTO.builder()
            .activityId(e.getActivityId())
            .activityName(e.getActivityId())
            .activityType(e.getType().toString())
            .processId(e.getProcessInstanceId())
            .build();
    log.info("Activiti event received: " + e.getType());
    RemoteEndpoint.Basic remoteEndpoint = session.getBasicRemote();
    try {
        remoteEndpoint.sendText(om.writeValueAsString(dto));
    } catch (IOException ex) {
        throw new RuntimeException(ex);
    }
}

Here the event properties are wrapped in a transport class object (ProcessEventDTO), then serialized to JSON using the Jackson ObjectMapper and sent to the client as a string. The client endpoint is retrieved from the session object injected into ProcessEngineEventsWebsocket when the session was opened.
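For completeness: ProcessEventDTO is just a plain transport object. A minimal sketch of its shape, with the field names inferred from the builder calls above and the JSON output further below (the actual class in the repository may differ, e.g. it may use Lombok to generate the builder):

// Sketch of the transport object; field names inferred from the builder calls and the JSON output in this post.
public class ProcessEventDTO {

    private String processId;
    private String activityId;
    private String activityName;
    private String activityType;
    private String customActivityType;

    // getters, setters and the builder (e.g. generated by Lombok's @Builder) omitted for brevity
}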

And that’s basically all about publishing Activiti events via websockets.

Example process and websockets client as test case

To demonstrate it working, I developed a simple BPMN process which gets auto-deployed during startup of the REST webapp.
This is done in the DemoDataGenerator class, which substitutes the original class from the REST webapp; the file 'event-demo-process.bpmn20.xml' is deployed in the method initDemoProcessDefinitions() (a rough deployment sketch follows the notes below). You can find this process in graphical notation below:

(Process diagram: events-process)

Notes:
– first-task and second-task just print some info to the server log
– timer-catch waits 8 seconds, then the process is resumed
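The actual DemoDataGenerator in the repository may look different, but deploying a classpath BPMN resource with the Activiti API typically boils down to something like this sketch:

import org.activiti.engine.ProcessEngine;
import org.activiti.engine.ProcessEngines;
import org.activiti.engine.RepositoryService;

// Sketch only: deploy the demo process definition from the classpath.
public class DeployDemoProcess {
    public static void main(String[] args) {
        ProcessEngine processEngine = ProcessEngines.getDefaultProcessEngine();
        RepositoryService repositoryService = processEngine.getRepositoryService();
        repositoryService.createDeployment()
                .name("event-demo-process")
                .addClasspathResource("event-demo-process.bpmn20.xml")
                .deploy();
    }
}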

Let's move on to the activiti-websockets-client project.

I developed a client that goes through this process automatically, in the form of a unit test.

The main test method goes as follows:

@Test
public void processShouldComplete() throws InterruptedException, Exception {
    final ProcessEventsWebSocket ws = con.getProcessEventsWebsocket();

    String processInstanceId = createProcessInstance();

    ws.addExpectedEventAndWait(15000, ProcessEventDTO.builder()
            .activityName("user-task").processId(processInstanceId).activityType("ACTIVITY_STARTED")
            .build());

    completeUserTask(processInstanceId);
    checkIfProcessHasFinished(processInstanceId);
}

Notes:
– A new process instance is created using the Activiti REST services. You can read the documentation of these services HERE. I used the Jersey client for dealing with HTTP requests and, again, Jackson for deserialization of the JSON objects representing server events.
– In the ProcessEventsWebSocket class, using the Eclipse Jetty websocket client library and a CountDownLatch (you can read more about it HERE), I implemented a simple mechanism to wait for an event with given field values set. For demo purposes, every event incoming in the onMessage() method is printed to standard output.
– In this particular test we wait for the ACTIVITY_STARTED event of 'user-task', then we can push the process forward in the completeUserTask() method, and finally we check whether the process has finished. This is also achieved using the Activiti REST API mentioned before.
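This is not the exact code from ProcessEventsWebSocket, but a minimal sketch of the CountDownLatch idea described above (class and method names here are made up for illustration):

import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

// Sketch: onEvent() is called from the websocket onMessage() handler for every incoming event
// and counts the latch down when the expected one arrives, while the test thread blocks on await().
public class ExpectedEventWaiter {

    private final CountDownLatch latch = new CountDownLatch(1);
    private final String expectedActivityName;

    public ExpectedEventWaiter(String expectedActivityName) {
        this.expectedActivityName = expectedActivityName;
    }

    public void onEvent(String activityName, String activityType) {
        if (expectedActivityName.equals(activityName) && "ACTIVITY_STARTED".equals(activityType)) {
            latch.countDown();
        }
    }

    public boolean await(long timeoutMillis) throws InterruptedException {
        return latch.await(timeoutMillis, TimeUnit.MILLISECONDS);
    }
}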

The output of the test should look like this:

Running pl.mwrobel.activiti.websockets.test.ActivitiWebsocketsIntegrationTest
 2014-04-28 12:54:52.081:INFO::main: Logging initialized @863ms
 http://127.0.0.1:9080/service/runtime/process-instances
 {"processId":"287","activityId":"start-event","activityName":"start-event","activityType":"ACTIVITY_STARTED"}
 {"processId":"287","activityId":"start-event","activityName":"start-event","activityType":"ACTIVITY_COMPLETED"}
 {"processId":"287","activityId":"first-task","activityName":"first-task","activityType":"ACTIVITY_STARTED"}
 {"processId":"287","activityId":"first-task","activityName":"first-task","activityType":"ACTIVITY_COMPLETED"}
 {"processId":"287","activityId":"timer-catch","activityName":"timer-catch","activityType":"ACTIVITY_STARTED"}
 {"processId":"287","activityId":"timer-catch","activityName":"timer-catch","activityType":"ACTIVITY_COMPLETED"}
 {"processId":"287","activityId":"second-task","activityName":"second-task","activityType":"ACTIVITY_STARTED"}
 {"processId":"287","activityId":"second-task","activityName":"second-task","activityType":"ACTIVITY_COMPLETED"}
 {"processId":"287","activityId":"user-task","activityName":"user-task","activityType":"ACTIVITY_STARTED"}
 MATCHED ****ProcessEventDTO(processId=287, activityId=null, activityName=user-task, activityType=ACTIVITY_STARTED, customActivityType=null)
 {"processId":"287","activityId":"user-task","activityName":"user-task","activityType":"ACTIVITY_COMPLETED"}
 {"processId":"287","activityId":"end-event","activityName":"end-event","activityType":"ACTIVITY_STARTED"}
 {"processId":"287","activityId":"end-event","activityName":"end-event","activityType":"ACTIVITY_COMPLETED"}
 closed connection
 Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.358 sec

And that is all for this proof-of-concept integration of Activiti with a websocket server. Hope it helped someone 😉

Postscript:

My thread on the Activiti forums about a somewhat more complicated example involving signals, and about problems regarding transactions and events in Activiti:
http://forums.activiti.org/content/possible-bug-eventing-activiti-not-ready-receive-signal-even-after-dispatching

So – in some cases this concept may still need some refining 😉

Resources:

Article on Java EE7 Websockets:
http://www.oracle.com/technetwork/articles/java/jsr356-1937161.html
CountDownLatch:
http://docs.oracle.com/javase/7/docs/api/java/util/concurrent/CountDownLatch.html
Detailed Activiti user guide:
http://www.activiti.org/userguide/
Jackson JSON Processor
http://jackson.codehaus.org/

How to draw a semicircle on Android Canvas?

I just resolved a problem which I first thought would be a matter of 5 minutes of finding the appropriate Android API function; unfortunately it took me much longer…

Problem:
Given the start and end points of a vector, draw a left- or right-hand semicircle, knowing that this vector is the diameter of the semicircle.

Solution:

Use the Android addArc function.

public void addArc (RectF oval, float startAngle, float sweepAngle)
http://developer.android.com/reference/android/graphics/Path.html#addArc(android.graphics.RectF, float, float)

As you can see, we need an oval rect and a startAngle, which are not obvious…
Below you can find a handy method computing the needed parameters.

/**
     *
     * @param xStart vector start point
     * @param yStart
     * @param xEnd vector end point
     * @param yEnd
     * @param ovalRectOUT RectF to store result
     * @param direction left/right (Side enum)
     * @return start angle
     */
    public static float getSemicircle(float xStart, float yStart, float xEnd,
            float yEnd, RectF ovalRectOUT, Side direction) {

        float centerX = xStart + ((xEnd - xStart) / 2);
        float centerY = yStart + ((yEnd - yStart) / 2);

        double xLen = (xEnd - xStart);
        double yLen = (yEnd - yStart);
        float radius = (float) (Math.sqrt(xLen * xLen + yLen * yLen) / 2);
    
        RectF oval = new RectF((float) (centerX - radius),
                (float) (centerY - radius), (float) (centerX + radius),
                (float) (centerY + radius));

        ovalRectOUT.set(oval);

        double radStartAngle = 0;
        if (direction == Side.LEFT) {
            radStartAngle = Math.atan2(yStart - centerY, xStart - centerX);
        } else {
            radStartAngle = Math.atan2(yEnd - centerY, xEnd - centerX);
        }
        float startAngle = (float) Math.toDegrees(radStartAngle);

        return startAngle;

    }

After that, drawing semicircles is dead simple, e.g.:

path.addArc(oval, startAngle, 180);
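Putting it together, a custom View using the getSemicircle() helper and the Side enum from above might look roughly like this (coordinates and paint setup are arbitrary, and getSemicircle()/Side are assumed to be defined in this class or imported):

import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.Path;
import android.graphics.RectF;
import android.view.View;

// Rough usage sketch: draws a left-hand semicircle over the vector (100,100) -> (300,300).
public class SemicircleView extends View {

    private final Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);

    public SemicircleView(Context context) {
        super(context);
        paint.setStyle(Paint.Style.STROKE);
        paint.setStrokeWidth(4f);
        paint.setColor(Color.BLUE);
    }

    @Override
    protected void onDraw(Canvas canvas) {
        super.onDraw(canvas);
        RectF oval = new RectF();
        float startAngle = getSemicircle(100, 100, 300, 300, oval, Side.LEFT);
        Path path = new Path();
        path.addArc(oval, startAngle, 180); // 180-degree sweep = a semicircle
        canvas.drawPath(path, paint);
    }
}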

Hope that helps! 😉

If you need to manipulate circle pieces a bit more specifically, I really recommend this article:
http://www.tbray.org/ongoing/When/200x/2009/01/02/Android-Draw-a-Curved-Line

How to configure Eclipse to debug Alfresco java webscript code

NOTE: It is assumed that the Eclipse environment has been configured to work with the Alfresco SDK. If it is not, the configuration process is described here: http://wiki.alfresco.com/wiki/Alfresco_SDK_3.4#Set_Eclipse_Compiler_Compliance_Level_to_6.0.

1. Select: Run menu -> Debug configurations…
2. Create a new configuration for 'Remote Java Application'
3. In the 'Project' field select the project you want to debug (presumably your webscript project; it is also possible to debug code from the Alfresco core outside the selected project)
4. In 'Connection properties' set the Tomcat server IP address and the debug port you defined earlier.

NOTE: Using 'localhost' instead of an IP address is not advised on Windows because of hosts file issues.

5. Click Apply, Debug
6. If everything went correctly you shouldn't get any error message.
7. Switch to the Debug perspective
8. You should see something like this:

(Screenshot: the Eclipse debug perspective)


In case of a debug session disconnection (e.g. a Tomcat restart) you may reconnect to it using the Relaunch option:

(Screenshot: reconnecting to the debugger)


9. Now you can set breakpoints and trace code execution of your webscript or other Alfresco SDK projects as usual. A good test is setting a breakpoint in org.alfresco.web.bean.LoginBean at the line
"FacesContext context = FacesContext.getCurrentInstance();" – the breakpoint should be hit at an Alfresco Explorer login attempt.

NOTE: You may get the following error message: "Unable to install breakpoint in org.evolpe.webscripts.XMLwebs due to missing line number attributes. Modify compiler options to generate line number attributes.
Reason:
Absent Line Number Information”

(Screenshot: the missing line number information dialog)

CAUSE & SOLUTION: To debug your own webscripts or other code it is essential to enable line number information in the compiler. If you use Ant, set the debug="true" parameter in the javac task.
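For example, with Ant a javac task with line number information enabled might look like this (paths are placeholders):

<!-- placeholder paths; debug="true" makes javac emit the line number attributes needed for breakpoints -->
<javac srcdir="src" destdir="build/classes" debug="true" debuglevel="lines,vars,source"/>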

How to configure Tomcat for a debug instance of Alfresco on Windows and Linux will be described soon in my next articles. ;)

How to perform form field validation in Alfresco Share?

It's possible to implement your own field validation function in Alfresco Share in JS by overriding the Share mandatory constraint, which uses Alfresco.forms.validation.mandatory located in /opt/alfresco-3.4.d/tomcat/webapps/share/js/forms-runtime.js.
The default configuration can be overridden in the way described here: http://wiki.alfresco.com/wiki/Forms#constraint-handlers
The validation-handler parameter is the name of your own JS function.
Here you can find a description of the validation function parameters: http://wiki.alfresco.com/wiki/Forms_Developer_Guide#Validation_Handlers

In the current version of Alfresco (3.4d) the validation 'fails silently', which means you can't dynamically display an error message; when validation fails it's simply not possible to submit the form. There are foundations for enhancing this mechanism, so it's quite possible that 'full' validation will be available in the near future.
You can download the full Eclipse Ant project for my example here. I will provide instructions for manual installation as well.

Ant installation
1. Set the paths to the Tomcat directory in build.properties:
TOMCAT_HOME=/opt/alfresco-3.4.d/tomcat
APP_TOMCAT_HOME=/opt/alfresco-3.4.d/tomcat
2. Just run 'ant' in the project directory.

So, let's start with the core of this tutorial.
1. I implemented my own simple data model for this example. As it is neither complicated nor directly connected with validation itself, I just attach the needed files with my project. If you're not familiar with defining data models you can read about it, for instance, here:
tutorial: http://www.tribloom.com/blogs/michael/2011/04/18/alfresco-repository-webscript/
detailed documentation: http://wiki.alfresco.com/wiki/Data_Dictionary_Guide#The_Data_Dictionary
validationModel.xml goes to /opt/alfresco-3.4.d/tomcat/webapps/alfresco/WEB-INF/classes/alfresco/extension/model
validation-model-context.xml goes to /opt/alfresco-3.4.d/tomcat/webapps/alfresco/WEB-INF/classes/alfresco/extension/

2. share-config-validation-custom.xml goes to /opt/alfresco-3.4.d/tomcat/webapps/share/WEB-INF/classes/alfresco/web-extension
This file contains the definitions of the create and edit forms for the new data type.
The piece of code concerning validation:


<field id="ex:gross" label-id="example.price.gross" mandatory="true"
	help-id="example.price.gross.help">
	<constraint-handlers>
		<!-- validation-handler param: js function name -->
		<constraint type="MANDATORY"
			validation-handler="Alfresco.forms.validation.exampleGrossValidation"
			event="keyup" />
	</constraint-handlers>
</field>

 

Alfresco.forms.validation.exampleGrossValidation is the name of the JS validation function.

3. exampleFormValidation.js and exampleFormValidation-min.js go to /opt/alfresco-3.4.d/tomcat/webapps/share/components/form

Alfresco.forms.validation.exampleGrossValidation = function exampleGrossValidation(field, args,
  event, form, silent, message) {

  var valid = true;

  valid = YAHOO.lang.trim(field.value).length !== 0;  // check that the field is not empty

  // Referencing other form fields is possible by the following convention:
  // the field object passed to the function has a 'form' object representing the whole form,
  // and field names are built by concatenating 'prop_' + {namespace from the data model} + '_' + {property name from the data model}
  var net = field.form.prop_ex_net.value;
  var gross = field.form.prop_ex_gross.value;
  var rate = field.form.prop_ex_rate.value;

  var check = ((rate / 100) + 1) * net;

  if (Math.abs(gross - check) > 0.01) {  // check that gross matches net * (1 + rate) within the given precision
    valid = false;
  }

  return valid;
};

 
4. exampleValidation_pl_PL.properties (labels for the forms) goes to /opt/alfresco-3.4.d/tomcat/webapps/share/WEB-INF/classes/alfresco/messages


You have to modify the following files:


1. slingshot-application-context.xml at /opt/alfresco-3.4.d/tomcat/webapps/share/WEB-INF/classes/alfresco/
The paths to share-config-validation-custom.xml and exampleValidation_pl_PL.properties go there.
At XPath '/beans/bean/constructor-arg/list' add:

<value>classpath:alfresco/web-extension/share-config-validation-custom.xml</value>

At XPath '/beans/bean/property/list' add:

<value>alfresco.messages.exampleValidation</value>

2. form.get.head.ftl – link to the .js file, at /opt/alfresco-3.4.d/tomcat/webapps/share/WEB-INF/classes/alfresco/site-webscripts/org/alfresco/components/form
Add: <@script type="text/javascript" src="${page.url.context}/res/components/form/exampleFormValidation.js"></@script>

3. toolbar.get.config.xml – in this file we can add a new option to the 'Create content' menu of an Alfresco Share site,
at /opt/alfresco-3.4.d/tomcat/webapps/share/WEB-INF/classes/alfresco/site-webscripts/org/alfresco/components/documentlibrary/
Add:
<content mimetype="text/xml" icon="xml" label="example.price.menu" itemid="ex:price"/>

Alfresco document preview doesn't work for MS Office files (.doc, .docx, .ppt, .pptx, etc.) although it works for PDF files. I get a "This document can't be previewed." message.

For reasons which are not obvious to me, the installation and starting of the alfrescoOpenOffice service doesn't always work properly during the automated Alfresco install/start script execution. If you have the same problem, here is the solution for a Windows environment.
1. Start -> Run -> services.msc. Check if the alfrescoOpenOffice service is listed. If not, go to {alfresco install dir}\openoffice\scripts and execute "openoffice_serviceinstall INSTALL" with administrative privileges.

2. Every time after starting Alfresco with {alfresco install dir}\servicerun START, also execute {alfresco install dir}\openoffice\scripts\openoffice_servicerun START.

3. If you're not happy with the OpenOffice service auto-starting with Windows, change the alfrescoOpenOffice startup type to "Manual".

4. MS Office document preview in Alfresco should be working just fine now 🙂

I get “Unable to locate the Javac Compiler” when trying to build using IAM Maven Eclipse plugin.

Problem:
When trying to build a Maven project using IAM under Eclipse I get an error like this:


"Unable to locate the Javac Compiler in:C:\\Program
Files\\Java\\jre6\..\\lib\\tools.jar
Please ensure you are using JDK 1.4 or above and
not a JRE (the com.sun.tools.javac.Main class is required).
In most cases you can change the location of your Java
installation by setting the JAVA_HOME environment variable"

Nevertheless, I have a proper installation of the newest Java JDK and JAVA_HOME is properly set.
The problem never occurred before.

Solution:

I know two possible causes of that problem:

1. Eclipse uses its own JRE to start, and IAM does the same. You've recently installed a new Maven plugin and it requires a JDK.
You can force Eclipse to start with a given Java VM directly by launching it this way from the command line:

"X:\path\to\eclipse\eclipse.exe -vm JAVA_JDK_PATH"

2. There are also cases when IAM breaks without an apparent cause, and that happened to me.
The reason may be IAM itself.
Try the standard tricks checklist from another post of mine:
Problems using Eclipse IAM and Maven

I get “The processing instruction target matching “[xX][mM][lL]” is not allowed.” error during start of Alfresco.

If you have just defined a new content type in Alfresco, the cause may be empty characters at the beginning of the model context file; even though the file is otherwise well-formed, the parser can't get past them.

E.g. it may result in a stack trace similar to:


org.springframework.beans.factory.parsing.BeanDefinitionParsingException: Configuration problem:
Failed to import bean definitions from URL location [classpath:alfresco/application-context.xml]
Offending resource: ServletContext resource [/WEB-INF/web-application-context.xml];
nested exception is org.springframework.beans.factory.parsing.BeanDefinitionParsingException:
 Configuration problem: Failed to import bean definitions from URL location
 [classpath*:alfresco/extension/*-context.xml]Offending resource: class path resource
[alfresco/application-context.xml]; nested exception is
org.springframework.beans.factory.xml.XmlBeanDefinitionStoreException:
Line 1 in XML document from file [D:\\xxxxxx-model-context.xml] is invalid;
nested exception is org.xml.sax.SAXParseException:
The processing instruction target matching "[xX][mM][lL]" is not allowed.
at org.springframework.beans.factory.parsing.FailFastProblemReporter.error(FailFastProblemReporter.java:68)
....

Also watch out for empty characters at the end of file paths in configuration XMLs – they cause problems too, as Alfresco can't find the files.
