Archive

Archive for the ‘Software’ Category

A new project with TypeScript and Angular

July 2, 2018

More than a year ago, I started a new adventure in a new startup company. New company, new adventure and a new project. New technology maybe?
Of course the risk of adopting a new technology in a new project is lower than migrating to a new technology in an existing one, but there still is a risk. Especially if the technology is young and almost no one in the team has experience with it.

I’ve been working on web projects for almost twenty years, and with JavaScript for all of that period. It is said that JavaScript is the least understood language. And even if you understand it, you need a very high level of discipline in designing your application and writing your code if you want to stay away from spaghetti code. One of the biggest issues with JavaScript, in my view, is that it’s not a strongly typed language. In the past I even tried to bring classes into my JavaScript code. But this solves the problem only partially.

You can understand my enthusiasm when I saw TypeScript. A strongly typed language for the web. Yoohoo! And an entire framework built on top – Angular. Angular, not AngularJS. I worked with both frameworks, and basically all they have in common is the name. Angular is also known as the next version of AngularJS, or Angular 2, 4, 5, 6 …

Now, coming back to the project. I proposed the new TypeScript/Angular as its development language/framework. At that moment it seemed like a big risk: no one on the team had used it before, and even I had used it in only a couple of projects, none of which made it into production. But now, in retrospect, I believe it was one of the best decisions when it comes to technology selection for a new project.

I won’t dwell too long on TypeScript and Angular, but I would still like to point out a few advantages that I really like, to make my case.

TypeScript

TypeScript is a strongly typed language for the web with a lot of similarities to JavaScript. It’s not an interpreted language, but a hybrid one that compiles to JavaScript. This way you’ll catch a lot of errors right in the development phase; even better, they’ll be flagged by your favorite IDE/editor.

Looking into the future, I think new projects and libraries should be written in TypeScript, even the ones targeting JavaScript. TypeScript is interoperable with JavaScript, the code compiles to JavaScript and the library is augmented also with type information for TypeScript users. The compiled script is optimized, obfuscated and easy to integrate. JavaScript acts like some kind of assembler code in this case.
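To make the point concrete, here is a small, hypothetical sketch (the `Order` model is mine, not from any real project) of the kind of checking you get for free:

```typescript
// A hypothetical data model: the interface documents the exact shape of the data.
interface Order {
  id: number;
  items: string[];
  total: number;
}

// Every property access and argument type below is verified at compile time.
function describeOrder(order: Order): string {
  return `Order ${order.id}: ${order.items.length} item(s), total ${order.total}`;
}

const order: Order = { id: 42, items: ["book", "pen"], total: 19.5 };
console.log(describeOrder(order)); // prints "Order 42: 2 item(s), total 19.5"

// Both lines below are rejected by the compiler (and flagged by the IDE)
// before the code ever runs -- in plain JavaScript they would fail only at runtime:
// describeOrder({ id: "42" }); // wrong type for 'id', missing 'items' and 'total'
// const t = order.totl;        // typo in the property name
```

When such an interface is published alongside a compiled library, JavaScript consumers get the plain functions and TypeScript consumers additionally get the type information.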

A lot of the TypeScript improvements came to JavaScript through the latest ECMAScript standards, but not all of them are widely supported. There are also initiatives for native TypeScript support directly in the browser. But quite a few of the advantages outlined above would still stand in a hybrid approach (compiled + interpreted).

In conclusion, I believe TypeScript is the modern and best choice when it comes to programming languages for the web. It’s so cool that sometimes I cannot believe it was made by Microsoft. Of course, it was a joint effort, and maybe this approach will make them think about their future in a more and more open community.

Angular

Angular is the perfect companion framework for TypeScript. Its componentized approach can seem like overkill in the beginning, but in an enterprise project you’ll quickly see its value. Components can be easily isolated and reused. It’s so easy to develop such a component that it can sometimes be easier to develop your own from scratch than to customize an existing third-party one. Of course, this should be the exception rather than the rule :).
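As a rough illustration of why components are so easy to isolate: an Angular component is just a TypeScript class plus some metadata. In a real project the metadata goes on the `@Component` decorator from `@angular/core`; in this hypothetical sketch it’s a plain object so the snippet stands alone:

```typescript
// In Angular this metadata would be passed to the @Component decorator
// (from @angular/core); a plain object stands in so the sketch is self-contained.
const greetingMetadata = {
  selector: "app-greeting",               // the tag used in parent templates
  template: "<h1>Hello, {{name}}!</h1>",  // the component's own markup
};

// A hypothetical component class: isolated state plus the logic behind the template.
class GreetingComponent {
  name = "world";

  greet(): string {
    return `Hello, ${this.name}!`;
  }
}

console.log(new GreetingComponent().greet()); // prints "Hello, world!"
```

Because the class owns its own state, template and logic, it can be dropped into another page (or another project) without dragging global state along with it.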

As I said earlier, AngularJS and Angular have basically only the name in common. Because of that it’s pretty hard to upgrade from the former to the latter. Upgrading between different versions of Angular is quite the opposite, as they maintain a high level of backward compatibility and features are deprecated progressively. It usually took me just a few hours to upgrade from Angular 2 to 4, from 4 to 5, from 5 to 6. Because TypeScript is strongly typed, the compiler, or even better the IDE, points out the errors, making the upgrade extremely easy and straightforward.

Of course, a homogeneous product is the ideal case, but those are so rare … We had to integrate our project with an existing one built on AngularJS. It was like a case study – how to upgrade and interoperate between the two Angulars. Angular came with a nice rescue solution here, and with a decent effort we came up with a clean way of doing it. I will not go into details here, but the nicest part, which definitely won my vote, was that you could actually upgrade module by module, or even component by component. And the effort finally paid off when we started to reuse parts of the new project in the old one.

If you want to start a new project, Angular is a very well equipped framework that comes out of the box with a TypeScript linter and compiler, webpack, SCSS support, unit and automation testing, polyfills etc. AngularJS did not have an official scaffolding tool, but Angular has the Angular CLI, which does a nice job.

TypeScript and Angular offered us a development landscape with an emphasis on ease of development, fewer errors and lots of reuse opportunities. I think it was the best foundation on top of which we could build a modular toolkit based on atomic design principles. We also managed to create a continuous build system where the code is lint-checked and compiled for different environments, catching a lot of issues right from that phase – a much harder or even impossible endeavor with JavaScript and other frameworks. We also integrated unit and automation tests, and we’re working on extending their coverage. This will give us the confidence to build new features at higher speed and shorten release cycles.

So, whenever you start a new project, especially if you’re unhappy with your development ecosystem, try investigating new ones – technologies are evolving at much higher speeds nowadays. For the past decade or so, the biggest issue in software and web development has been maintainability, even before performance. And more importantly, don’t be afraid of change – embrace it.

Categories: Software, Web

PhoneGap setup

March 22, 2016

It’s not the first time I’ve played with PhoneGap, but I haven’t done it in quite some time. I’ve always liked the idea of creating a platform-independent application. And if that application can be tested directly in the web browser, even better.

Creating a user interface in a descriptive language like HTML is easier than a programmatic approach where you have to write code to create your visual components. Nowadays most frameworks also offer the descriptive approach, usually through XML, but learning a new language when you already know a more powerful one is not that appealing. HTML is also augmented by CSS, which easily offers a high degree of customization, and by JavaScript, which brings the functionality. All together they create a platform-independent framework with a high degree of customization and a clear separation of layers.

So it’s clear why I liked the idea of PhoneGap right from the start. Now, let’s set it up.

To develop a PhoneGap application you don’t need too many things. The easiest way is to install Node.js and then PhoneGap: npm install -g phonegap.

Then you can create a sample application with phonegap create my-app, a command which will create all the necessary files and subfolders under the my-app folder.

Now comes the testing part, and for this you need to install PhoneGap Desktop. As I said, it’s nice that you can test your app directly in your browser by visiting the link displayed at the bottom of the PhoneGap Desktop window, e.g. http://192.168.0.1:3000 (hint: it doesn’t work with localhost or 127.0.0.1). And if you install the PhoneGap Developer App you can easily test on your mobile too, without the hassle of reinstalling the application every time you make a change – changes will be automatically deployed (reloaded).

When you’re done with that, the fun part comes – actually building the application. Let’s do this for Android.

First you need to install JDK (I tested with version 8) and Android Studio.

And then you need to set up some environment variables:

  • JAVA_HOME – this must be set to the folder where your JDK, not JRE, is installed.
  • ANDROID_HOME – this must be set to the folder where your Android environment is installed.
  • add the following to PATH: %ANDROID_HOME%\tools;%ANDROID_HOME%\platform-tools;%JAVA_HOME%\bin on Windows, or ${ANDROID_HOME}/tools:${ANDROID_HOME}/platform-tools:${JAVA_HOME}/bin on Linux (note that Linux uses : as the separator and forward slashes)
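On Linux, that setup might look like the following (the install locations below are hypothetical examples; adjust them to wherever your JDK and Android SDK actually live):

```shell
# Hypothetical install locations -- replace them with your own paths.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64   # the JDK folder, not the JRE
export ANDROID_HOME="$HOME/Android/Sdk"              # the Android SDK folder

# Note that Linux uses ':' as the PATH separator, unlike Windows' ';'.
export PATH="$ANDROID_HOME/tools:$ANDROID_HOME/platform-tools:$JAVA_HOME/bin:$PATH"
```

Put these lines in your ~/.bashrc (or your shell’s equivalent) so they survive a restart.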

If the above are not correctly set, or the PATH is invalid (e.g. it has an extra quote (“) or semicolon (;)), you can run into errors like

  • Error: Failed to run "java -version", make sure that you have a JDK installed. You can get it from: http://www.oracle.com/technetwork/java/javase/downloads. Your JAVA_HOME is invalid: /usr/lib64/jvm/java-1.8.0-openjdk-1.8.0
  • Error: Android SDK not found. Make sure that it is installed. If it is not at the default location, set the ANDROID_HOME environment variable.

I also had to run

phonegap platforms remove android
phonegap platforms add android@4.1.1

By default I had Android 5.1.1 installed, but I was getting the error Error: Android SDK not found. Make sure that it is installed. If it is not at the default location, set the ANDROID_HOME environment variable. You can check which platforms you have installed by running the command phonegap platforms list.

Make sure that you have all the Android tools and SDKs installed by running android on the command line, then select all the ones not installed and install them.

Finally, you can build the application by running the following command in your project folder:

phonegap build android

and if everything goes well you’ll find your apk at <your-project-dir>/platforms/android/build/outputs/apk.

Categories: Software, Web

I had it with Maven

April 6, 2015

Initially I was a big Maven fan. It was the best build tool. When coding in C I used make, and after switching to Java, Ant was naturally next.

But Maven was so much better. It had a few things you couldn’t help but love. First of all, dependency management. Getting rid of downloading the jars, including them in a lib folder, and the even bigger pain of updating them … wow … getting rid of all of that was a breeze.

Maven also had a standard build lifecycle, so you could download a project and start a build without knowing anything about it. You could have done this in Ant, but there, projects should have followed a convention, which wasn’t always the case. In all honesty, there wasn’t even an existing one :), at least not a written, formal one.

And then Maven came with a standard folder structure. If you pick up a new project, it’s clearly easier to understand it and find what you’re looking for.

And … that’s about it. I would like to say that the fact that it uses XML was also a good thing. XML is powerful because it can be easily understood by both humans and computers (read: programs). But no other tool, except Maven, was interested in understanding its POM. And while XML is great for describing and formalizing something, like a workflow, using it for imperative tasks is not. Ant was doing this, and going through an Ant build wasn’t the easiest task of all.

Maven was also known for its verbosity. If you go to mvnrepository.com and take any package in there, you’ll clearly see which one is the most verbose:

<dependency>
    <groupId>org.thymeleaf</groupId>
    <artifactId>thymeleaf</artifactId>
    <version>2.1.4.RELEASE</version>
</dependency>

as compared to Gradle, for example:

'org.thymeleaf:thymeleaf:2.1.4.RELEASE'

And that’s for adding just one dependency. If you have 30 or more in your project, which for a web application is not uncommon, you end up with a looot of code …

And then comes the customization. It is very good that Maven comes with a standardized lifecycle, but when it’s not enough (and usually it isn’t) it’s very hard to customize. To do something really useful you’ll need to write a plugin. I know there is the exec plugin, but it has some serious drawbacks, and the most important one is that it cannot do incremental builds. You can simply run a command, but you cannot run it only when some destination files are outdated compared to their corresponding sources.
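To show what that gap looks like on the other side, here is a rough sketch (the paths and the compass invocation are made up for illustration, not taken from my actual build) of an incremental custom step in a Gradle build file. Declaring the task’s inputs and outputs is all Gradle needs in order to skip the command when the destination files are newer than their sources:

```groovy
// build.gradle (Groovy DSL) -- a hypothetical CSS compilation step.
// The declared inputs/outputs let Gradle do the up-to-date check itself.
task compileCss(type: Exec) {
    inputs.dir  'src/main/sass'   // source files Gradle watches for changes
    outputs.dir 'build/css'       // destination files compared against them

    commandLine 'compass', 'compile',
                '--sass-dir', 'src/main/sass',
                '--css-dir',  'build/css'
}
```

If nothing under src/main/sass has changed since the last run, Gradle reports the task as UP-TO-DATE and never runs the command – exactly the incremental behavior the Maven exec plugin lacks.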

So, I needed something else. I looked a little bit over some existing build tools, and while none of them seemed very appealing, I ended up switching to Gradle. Two reasons: Spring is using it (I’m a big fan of many Spring projects) and I also wanted to get acquainted with Groovy.

While switching over a web application project, which took me 3 days, I ended up with a 10 KB build file instead of a 36 KB one, plus many added features. I was able to build my CSS and JS using Compass/Sass and browserify and, more importantly, incrementally (though as a whole).

I was also able to better customize my generated Eclipse project, including specifying derived folders. As a side note, for improved Gradle support in Eclipse you need to install Gradle IDE from the update site http://dist.springsource.com/release/TOOLS/update/e4.4/ and the minimum required version is 3.6.3 – see here why. You may need to uncheck Contact all update sites during install to find required software if you get an installation error.

Gradle is probably not the dream build tool – it has a very loose syntax and it mixes descriptive properties with imperative tasks, reducing readability – but it’s less verbose and much more flexible than Maven. With some coding standards applied on top, it could probably become a good choice.

Categories: Software

DomELResolver

September 6, 2012

Starting with JSP 2.1 there is a very nifty feature: ELResolvers. Even though it’s a little less popular, it can make your code nice and clean.
It is well known that scriptlets are a no-go for JSP development. As an alternative, you can use custom tags and the expression language. The nice thing about EL is that it is easily extensible. You can define custom functions, using tag library descriptors, and through resolvers you can plug in bean property resolution.

Now I’ll walk you through a full blown example of working with ELResolvers.

I don’t know if you ever had to work with XML documents inside your JSP, but the options aren’t so good, and in the end your code will look ugly, whether you use scriptlets (totally not recommended), complex custom tags or custom functions. I acknowledge that XML/DOM processing should happen mostly on the controller side (servlet, Spring controller or other), but there are times when you need it in the JSP, like some kind of transcoder functionality.
JSTL offers a set of tags for XML processing. Still, it feels a little bit hard to work with. Just imagine you had to use the value of an XPath in an attribute.

<x:set var="title" select="$doc//title"/>
${title}

It would be nice if you could do it directly: ${doc["//title"]}. And this is where the ELResolver comes in. It practically instructs the EL engine how to evaluate a property on a bean.

Implement an EL Resolver

As the code will tell better the story, here it is.

import java.beans.FeatureDescriptor;
import java.util.ArrayList;
import java.util.Iterator;

import javax.el.ELContext;
import javax.el.ELException;
import javax.el.ELResolver;

import org.jaxen.JaxenException;
import org.jaxen.XPath;
import org.w3c.dom.Element;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;

public class DomELResolver extends ELResolver {

	@Override
	public Class<?> getCommonPropertyType(ELContext context, Object base) {
		if (base instanceof NodeList)
			return Integer.class;
		return String.class;
	}

	@Override
	public Iterator<FeatureDescriptor> getFeatureDescriptors(ELContext context, Object base) {
		return null;
	}

	@Override
	public Class<?> getType(ELContext context, Object base, Object property) {
		// guard against null: the property may not resolve to a value
		Object value = getValue(context, base, property);
		return value == null ? null : value.getClass();
	}

	@Override
	public Object getValue(ELContext context, Object base, Object property) {
		if (property == null) {
			return null;
		}
		// get the property as a string
		String propertyAsString = property.toString();
		
		if (base instanceof ListOfNodes) {
			// if the base object is a list of nodes, the property must be an index
			int index = getPropertyAsIndex(property);
			if (index >= 0) {
				if (context != null) {
					context.setPropertyResolved(true);
				}
				return ((ListOfNodes)base).get(index);
			} else {
				base = ((ListOfNodes)base).get(0);
			}
		}
		
		if (base instanceof NodeList && !(base instanceof Node)) {
			// if the base object is a DOM NodeList, the property must be an index
			int index = getPropertyAsIndex(property);
			if (index >= 0) {
				if (context != null) {
					context.setPropertyResolved(true);
				}
				return ((NodeList)base).item(index);
			} else {
				base = ((NodeList)base).item(0);
			}
		}
		
		if (base instanceof Node) {
			Node baseNode = (Node)base;
			
			// if the property contains special characters, it is most probably an XPath expression
			if (!containsOnlyAlphaNumeric(propertyAsString)) {
				try {
					// creates the XPath expression and evaluates it
					XPath xpath = new org.jaxen.dom.DOMXPath(propertyAsString);
				    ListOfNodes l = new ListOfNodes();
				    for (Object result : xpath.selectNodes(base)) {
				    	if (result instanceof Node) {
				    		l.add((Node)result);
				    	}
				    }
				    // if we found a node then we consider the expression resolved
				    if (!l.isEmpty()) {
						if (context != null) {
							context.setPropertyResolved(true);
						}
						return l;
				    }
				    return null;
				} catch (JaxenException exc) {
					throw new ELException("Cannot compile XPath.", exc);
				}
				
			}
			
			// if the base bean is a node and has children, then get the children of which tag name is
			// the same as the given property
			if (hasChildElements(baseNode)) { 				
				ListOfNodes c = getChildrenByTagName(baseNode, propertyAsString);
				if (c != null) {
					if (context != null) {
						context.setPropertyResolved(true);
					}
					return c;
				}
			}
		}
		
		// evaluates the expression to an attribute of the base element
		if (base instanceof Element) {
			Element el = (Element) base;
			if (el.hasAttribute(propertyAsString)) {
				if (context != null) {
					context.setPropertyResolved(true);
				}
				return el.getAttribute(propertyAsString);
			}
		}
		
		return null;
	}

	@Override
	public boolean isReadOnly(ELContext context, Object base, Object property) {
		// we cannot modify the DOM
		return true;
	}

	@Override
	public void setValue(ELContext context, Object base, Object property, Object value) {
		// we don't modify the DOM
		return;
	}
	
	/**
	 * @param property the EL property 
	 * @return the property as an integer, -1 if the property is not a number 
	 */
	private int getPropertyAsIndex(Object property) {
		int index = -1;
		if (property instanceof Number) {
			index = ((Number)property).intValue();
		} else if (property instanceof String) {
			try {
				index = Integer.parseInt(property.toString());
			} catch (NumberFormatException exc) {
				// not a number: leave index as -1
			}
		}
		return index;
	}
	
	/**
	 * @param s the string to be tested
	 * @return true if the given string contains only alphanumeric characters, false otherwise
	 */
	private boolean containsOnlyAlphaNumeric(String s) {
		for (int i = 0, n = s.length(); i < n; i++) {
			if (!Character.isLetterOrDigit(s.codePointAt(i))) {
				return false;
			}
		}
		return true;
	}
	
	/**
	 * @param el the element to search for children with a given tag name
	 * @param tagChildName the tag name 
	 * @return the first element with the given tag name
	 */
	private ListOfNodes getChildrenByTagName(Node el, String tagChildName) {
		ListOfNodes l = new ListOfNodes();
		NodeList children = el.getChildNodes();
		for (int i = 0, n = children.getLength(); i < n; i++) {
			Node c = children.item(i);
			if (c instanceof Element) {
				Element ce = (Element)c;
				if (tagChildName.equals(ce.getTagName())) {
					l.add(ce);
				}
			}
		}
		return l.isEmpty() ? null : l;
	}

	/**
	 * @param el the DOM element
	 * @return true if the given DOM element has at least one children element, false otherwise
	 */
	private boolean hasChildElements(Node el) {
		NodeList children = el.getChildNodes();
		for (int i = 0, n = children.getLength(); i < n; i++) {
			if (children.item(i) instanceof Element) {
				return true;
			}
		}
		return false;
	}
	
	/**
	 * Encapsulates a list of nodes to give EL the opportunity to work with as with a normal collection.
	 * Also it evaluates to a string, as the first node text content.
	 */
	@SuppressWarnings("serial")
	private static class ListOfNodes extends ArrayList<Node> {
		@Override
		public String toString() {
			return get(0).getTextContent();
		}
	}
	
}

This entire code practically instructs the EL engine how to evaluate a property on a DOM element, either as a child tag name or as an XPath.

Register an EL Resolver

Before using it, you need to register your resolver. The code below does this; you can put it in a ServletContextListener, a ServletContextAware Spring bean or any other mechanism that registers it at application startup.

Using a ServletContextAware bean
import javax.el.ELResolver;
import javax.servlet.ServletContext;
import javax.servlet.jsp.JspApplicationContext;
import javax.servlet.jsp.JspFactory;

import org.springframework.web.context.ServletContextAware;

/** Registers the specified ELResolver's in the JSP EL context. */
public class ELResolversRegistrar implements ServletContextAware {

	/**
	 * The EL resolvers to be registered.
	 */
	private ELResolver[] resolvers;
	
	/** 
	 * Creates a registrar with the given resolvers.
	 * @param resolvers the EL resolvers to be registered 
	 */
	public ELResolversRegistrar(ELResolver... resolvers) {
		this.resolvers = resolvers;
	}

	/** 
	 * Creates a registrar with the given resolver.
	 * @param resolver the EL resolver to be registered 
	 */
	public ELResolversRegistrar(ELResolver resolver) {
		this(new ELResolver[]{resolver});
	}

	public void setServletContext(ServletContext servletContext) {
        JspApplicationContext jspContext = JspFactory.getDefaultFactory().getJspApplicationContext(servletContext);
        for (ELResolver resolver : resolvers) {
        	jspContext.addELResolver(resolver);
        }
	}
}

and the code in the Spring context XML

	<bean class="ELResolversRegistrar">
		<constructor-arg>
			<bean class="DomELResolver"/>
		</constructor-arg>
	</bean>
Using a ServletContextListener
import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;
import javax.servlet.jsp.JspApplicationContext;
import javax.servlet.jsp.JspFactory;

public class DomELResolverRegistrarListener implements ServletContextListener {

	public void contextInitialized(ServletContextEvent sce) {
        JspApplicationContext jspContext = JspFactory.getDefaultFactory()
        		.getJspApplicationContext(sce.getServletContext());
       	jspContext.addELResolver(new DomELResolver());
	}

	public void contextDestroyed(ServletContextEvent sce) {
		// do nothing
	}

}

and in web.xml

	<listener>
		<listener-class>DomELResolverRegistrarListener</listener-class>
	</listener>

It is obvious why I prefer the first version: it is easily extensible and you need only one registrar class for multiple resolvers.

And in the end, some examples of how you can use the resolver in a JSP.

Use an EL Resolver

<%@ taglib prefix="c" uri="http://java.sun.com/jsp/jstl/core" %>
<%@ taglib prefix="x" uri="http://java.sun.com/jsp/jstl/xml" %>
<x:parse varDom="doc">
<html>
<head>
	<title>Hello</title>
</head>
<body>
	<ul>
		<li>Item 1</li>
		<li>Item 2</li>
	</ul>
</body>
</html>
</x:parse>
 
Title: ${doc.html.head.title} = ${doc["//title"]}

<br/>
Items: 
<c:forEach var="i" items="${doc.html.body.ul.li}">
 	${i.textContent}
</c:forEach>
 
<br/>
First item: ${doc.html.body.ul.li}

You can modify the resolver to even evaluate CSS selectors. But that’s your homework. Or a future story.

Categories: Software

Install Ubuntu 12.04 on HP EliteBook 8560w

June 10, 2012

I got an HP EliteBook 8560w and besides Windows 7, I also wanted to install Ubuntu, specifically Ubuntu Desktop, latest version 12.04.

I downloaded the distribution image and followed the procedure from http://www.ubuntu.com/download/help/create-a-usb-stick-on-windows. But my findings in this article also apply if you install from a CD or SD card. The EliteBook 8560w has an integrated SD card reader and you can boot from it. To create an SD card with the install image on it, follow the same procedure as for the USB stick.

Now that you have the CD/USB stick/SD card with the Ubuntu image on it, follow these steps:

  1. Restart your laptop.
  2. Press F10 (or Escape and then F10) to open BIOS settings menu.
  3. Reset to default BIOS settings by selecting this option in the first tab and confirm the action.
  4. Use the left/right arrow keys to go to the System configuration tab.
  5. Using up/down arrow keys, select BIOS boot order.
  6. Make sure CD/USB Hard drive/SD card is selected, depending on your install storage medium.
  7. You can also modify the boot order.
  8. Save changes and exit.
  9. If you haven’t modified the boot order accordingly, press F9 (or Escape and then F9) and select the medium from which to boot – the CD/USB stick/SD card with the Ubuntu image on it.
  10. Your Ubuntu installer will start. If boot procedure is stuck after displaying one line of text, restart from step 1.
  11. Select from the Ubuntu installer menu “Advanced settings”.
  12. Press F1 and then F6. It will open a command prompt.
  13. Type live nouveau.modeset=0. It will start the Live Ubuntu on your stick.
  14. Start the installer by clicking on the icon on the desktop.
  15. Now follow the usual installer procedure.
  16. Wait until your system restarts.
  17. When you enter the GRUB menu, press ‘e’ to edit the commands. Later update: As per one of the comments below, you may need to press Shift to enter GRUB menu.
  18. Add the nouveau.modeset=0 kernel parameter to the linux command (after quiet splash)
  19. Boot your system and log in
  20. Now you have to add the nouveau.modeset=0 kernel parameter to the GRUB options.
    1. Edit the /etc/default/grub file and add the kernel parameter to the GRUB_CMDLINE_LINUX_DEFAULT="quiet splash" line, like GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nouveau.modeset=0".
    2. On the command line run sudo update-grub
    3. Alternatively you can install GRUB Customizer.
  21. Install the nVidia drivers
    1. Go to System Settings.
    2. Select Additional drivers and install the Recommended one.

Just to summarize a few things to keep in mind:

  • BIOS settings need to be reset to defaults.
  • The install medium needs to be marked as bootable in BIOS.
  • Do not start with the installer directly, but with the live Ubuntu and start the installer from there.
  • The Linux boot parameter nouveau.modeset=0 needs to be set. Alternatively you can try acpi=off. This workaround is necessary only for the laptops that ship with an nVidia chipset, which is the usual case.
  • nVidia additional drivers must be installed after the usual Ubuntu installation.

That’s it!

Categories: Software

Documentation – do I have to write it?

February 2, 2012

Some time ago I heard someone say, half seriously: “It was hard to write, why should it be easy to understand?”
Leaving the joke aside, is documentation necessary? I would definitely say YES. And there is no solid reason against it. Let’s discuss the possible concerns you may have against it.

Power position. If you are the only one who knows the inner workings of the code, you are tempted to say you are on the safe side. True, but only if you are the only one working on that project, which I would say already puts you in a power position anyway. But as nowadays everything happens in a team, sooner or later someone will want, or have, to understand and use your code. And then the so-called job security comes down to your value and the value of your code. Which, as you will see, is heavily influenced by the documentation. And anyway, personally, I like to be measured by both the value that I brought and the value that I can still bring in the future.

Lost time. This is the most common argument when it comes to documentation. You will hear the developer saying “I barely have the time to write the code, I don’t have time to also document it.” In a marathon it is said that if you don’t know how to pace your effort, whatever time you gain in the beginning by running much faster than usual, you will lose twice over in the end. In software development, I would say that it is even worse. If you don’t document, that ratio can be even 1:10. But why?

To better understand, let’s see the pro reasons – why document your code?
Better understanding. When you document your code, you explain it in a more formal way. It’s like teaching a class of students about your application’s inner workings. If the application architecture and coding are not clear, it will be harder to explain. This way, architectural or coding issues will surface right from the beginning. That is, if you document right from the beginning :).

Shorter learning curve. New members in the team or on the project will be able to easily understand it and catch up. And most importantly, they will be able to do this on their own, without wasting other people’s time on explanations. And if the development team is split between multiple projects, you’ll be able to share resources much more easily.

Easier maintenance. The nice part is that the documentation can even help you. If you haven’t touched an application for at least six months, you will thank yourself for documenting it.

Sooner or later, writing documentation will pay off. To make it sooner, I would like to end by sharing a few guidelines about writing documentation.

  • Be clear and concise. Write only as much as is necessary to understand the system. Write documentation like you are explaining it to others, not to yourself.
  • Don’t make assumptions. Don’t think that some parts are already known unless you explained them before. Suppose that the reader doesn’t know anything else, except what’s in the documentation.
  • Write only if it’s adding value. JavaDoc comments like A constructor. or The XClass class. or Sets the name (for a setName method) don’t say anything you didn’t know already. In my view, if the name is self-explanatory, the documentation is not necessary. Instead, describe just the specifics.
  • Use references where appropriate. Reuse documentation the same way you reuse code, and don’t describe the same thing twice. If something was already documented, even in an external resource, reference it instead of explaining the entire thing. At most, include a short summary. If you’re using a Spring Controller in your web application, it does not make any sense to explain all of Spring MVC.
  • Use easy-to-use tools. Write documentation using tools that are easy to use and that your team members are acquainted and comfortable with. Make documenting really easy. To get developers to write documentation you already have to fight a somewhat negative mindset; don’t make your life harder by fighting the tools too.
  • Use collaborative tools. Do not write documentation in Word documents, PowerPoint presentations or similar and do not send any documents by email. Use a central place for your documentation and use references in your communications. This will help you consolidate your effort and change the mindset. Shortly you will see that everyone will refer to that central documentation point instead of just asking different people.
  • Leverage templates. If you already made a habit of writing documentation some patterns and templates will emerge. Use them – reuse is always good.

Most of the unsuccessful open source projects are in this position because of lack of documentation. Many commercial projects are over time and over budget because of the same reason. Don’t say it’s not worth, unless you REALLY tried.

Categories: Software

Generic project management – really?

September 25, 2011 Leave a comment

I heard a lot of people saying that a PM is a PM. And if you manage one type of projects, you can manage any type of project.
But I don’t think that’s true. And I will argue this opinion in the current article.

First of all let me tell you that I understand where this idea is coming from. The reason behind is that a PM should have only knowledge about managing a project and he can easily apply the methods and processes to any kind of project. Far from true. The real value of a PM comes from taking the theoretical tools and processes and tailor them to fit the specifics of the project and company. Without it, a PM is no more than a glorified secretary. Or can be easily replaced by an automated software tool.

Another thing that is easily looked over is the field expertise. The argument is that a PM should not be an SME (Subject Matter Expert). True, but he should be able to understand one, to have a common language. Let me give you one example. In a software project (I will mostly use examples from this field as this is to what I’m acquainted with) a PM can easily have access to SME knowledge through a senior developer, team leader, architect etc. But a PM should also play the role of a mediator/negotiator between the resources/teams involved in a project. What will happen when two SMEs from two different teams (like QA and dev) don’t reach an agreement because what it lacks is a common ground, a common language. Then the PM should step into the scene and lay out this common ground. But if he lacks the minimal expertise in the field to understand both languages and bridge the teams …
In business, there is a saying: a worst decision is better than a late decision. Same in project management. And sometime, the PM will end up in the position of taking hard decisions for the well being of the project. Sometime these decisions are at the borderline of two SMEs.

Just for the fun let’s see a discussion between a PM with a background in construction and a developer.
John (developer): – Mike, the database broke just two hours ago.
Mike (PM): – But this should not be a problem. We will order one, you’ll unscrew the old one and mount the new one.
John (developer): [Thinking: Ohh geez, what a %&#$]
And we can laugh about the other side of the coin too. But my expertise in the construction field does not allow me to make a good one :). Of course, the example is exaggerated and this exact situation can be easily handled by an SME, but I think you got the point.

And not to forget the human factor – the professionals in the team can feel frustrated and not understood if the PM does not employ a common language and a minimal expertise in their field.

So you can easily see, why a PM is a little bit of everything plus a PM. Of course, real life examples can easily contradict me (and any of us knows at least a few), but the actual added value of the PM in those cases is worth pennies.

Categories: Software

XPath2 functions in Java

August 10, 2011 2 comments

Some time ago I wrote an article about how to implement custom functions in XSLT/XPath in order to bring XPath 2 functions to XSLT 1 engine in .Net. Now it’s Java turn. And this is actually much easier.

First of all, how do you define Java custom functions in XSLT. Opposed to .Net, no workaround is needed, you simply define a namespace for all the public static methods in a class. And then you can easily access all these methods.

<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:xpath2="java:com.wordpress.beradrian.XPath2Utilities">
...
<!-- select the XHTML elements of which CSS class is 'one' or 'two' -->
<xsl:for-each select="descendant::*[xpath2:matches(string(@class), '(\b(one|two))*')]">
...
</xsl:stylesheet>

Now all you have to do is write the com.wordpress.beradrian.XPath2Utilities class and its static method – one of them is matches.

package com.wordpress.beradrian;

import java.util.regex.Matcher;
import java.util.regex.Pattern;

/**
 * Emulates XPath2 functions extending XSLT 1.0. capabilities.
 */
public class XPath2Utilities {
	public static String replace(String input, String pattern, String replacement) {
		return input.replaceAll(pattern, replacement);
	}
	
	/**
	 * Matches an input string against a regular expression
	 * @param input
	 * @param pattern the regular expression pattern
	 * @return true if the input matches the given regular expression, false otherwise
	 */
	public static boolean matches(String input, String pattern) {
		try {
			Pattern p = Pattern.compile(pattern);
			Matcher m = p.matcher(input);
			return m.find();
		} catch (Exception e) {
			return false;	
		}
	}
	
	/**
	 * Compares two strings.
	 * @param first first string to compare
	 * @param second second string to compare
	 * @return a negative value if first string is lexicographically smaller than the second one, 
	 * 0 if they're equal and positive if first one is bigger than second one
	 */
	public static int compare(String first, String second) {
		return first.compareTo(second);
	}

	/**
	 * @param first the full string
	 * @param second the string to check if it is a suffix of the first one.
	 * @return true if the second string is a suffix for the first one 
	 */
	public boolean endsWith(String first, String second) {
		return first.endsWith(second);
	}	
}

And now let’s see how you can put the pieces together and test the result. I’m using Eclipse and starting with version 3.4 (I think, I’m using Indigo anyway) it has an XSLT debugger integrated. In Project Explorer view, right click on an XSLT file and select Run As … / XSL Transformation. Then choose the XML file and the XML result will be generated. As this is as any run/debug item you can easily configure it. Go to Run / Run configurations, select the Classpath tab and then add your project, the one containing the class com.wordpress.beradrian.XPath2Utilities. You can also check the log in Console view.

For reference you can check XPath functions 1 and 2.

Categories: Software

Overriding Spring Beans

May 8, 2011 7 comments

Spring IoC container is a very powerful configuration tool. So powerful that became a de-facto industry standard. I would choose it without second doubts in almost any project, especially bigger ones.
Still there is a feature in Spring that I would consider it confusing. With Spring IoC Container you can override beans. Very simple actually. Just define another bean with the same id. Spring will consider the last definition and ignore all the others before. You can have the following scenarios

a.xml
		<beans>
			<bean class="java.lang.String" id="x">
				<constructor-arg>Bean from A</constructor-arg>
			</bean>
		</beans>
b.xml
		<beans>
			<import resource="a.xml"/>
			<bean class="java.lang.String" id="x">
				<constructor-arg>Bean from B</constructor-arg>
			</bean>
		</beans>
web.xml
		<context-param>
			<param-name>contextConfigLocation</param-name>
			<param-value>classpath:b.xml</param-value>
		</context-param>
a.xml
		<beans>
			<bean class="java.lang.String" id="x">
				<constructor-arg>Bean from A</constructor-arg>
			</bean>
		</beans>
b.xml
		<beans>
			<bean class="java.lang.String" id="x">
				<constructor-arg>Bean from B</constructor-arg>
			</bean>
		</beans>
web.xml
		<context-param>
			<param-name>contextConfigLocation</param-name>
			<param-value>classpath:a.xml,classpath:b.xml</param-value>
		</context-param>

In both cases the Bean from A is overridden by the Bean from B. In the first case a.xml is imported in b.xml and only a.xml is referenced from web.xml and in the second case both a.xml and b.xml are referenced from web.xml, but b.xml is the last one.

If you have many configuration files and many modules this can be somewhat confusing. Personally I would add an attribute named, let’s say, override, that must be set to true if you want that the current bean override another bean definition. If not, then an exception will be thrown during configuration load. (default value will be false).
I also encountered another issue with Spring configuration: having the same configuration file name in different classpath locations (aka jars). This lead to undeterministic behavior (unlike the first case) resulting in loading one of the two files. But this one is clearly bad practice from the programmer side. I usually keep my Spring configuration files along with the sources. So in the package com.my.package I have a file package-ctx.xml. This can be referenced in Spring as classpath:/com/my/package/package-ctx.xml. I would not recommend names like spring-ctx.xml or general.xml. They are too general and counterintuitive.

Categories: Software Tags:

Code analysis and review

January 19, 2011 6 comments

In a big team, with many developers involved, code review is essential. But do you have the tools to do it?
Certainly. As I was lately using Eclipse (and I want to do as much as possible in one place) I will further describe some plugins that will help you accomplish that.

Code analysis

I will start with some automated code analysis tool.
I’m using FindBugs and PMD. My goal is to have 100% clear report from FindBugs. On PMD, things are a little bit different, and this plugin is reporting even the slightest problems, even though they aren’t of real concern. But this tool has a useful feature, Find Suspect Cut and Paste to find duplicate code. It does not always do the best job, but increasing the tile size can help.

If you need report metrics about your project a good choice can be Metrics. It calculates a lot of metrics, it can do a dependecy analysis, export the reports into XML.

All these tools can be easily installed in Eclipse from the update sites. Then they will provide contextual menu items to run the analysis on file, folders or the entire project, displaying the results in specific views. They can be integrated in your build process, running as Ant tasks and generating reports in various formats, usually XML.

Code review

I love automation, but human intervention is the ultimate factor. In this case – code review. And the tool here is Jupiter. I must say that I love this tool – easy to use and right on target. Unlike other similar tools, it does not require an additional web application or any additional software. It stores everything on files, which can be maintained in your source versioning system, to coordinate and communicate the review between team members.
I will enter here in a few more details.
After installing Jupiter like any other Eclipse plugins and restarting the environment, you will have to setup a review, to keep everything under the same roof. So, click on the project that you want to review, then Properties > Review > New … A wizard will start to configure your review (you can edit these settings at a later time). Here you can setup

  • the review name and description;
  • the files to be reviewed – you can leave blank for all;
  • the reviewers – I would suggest to use the developer’s email addresses;
  • review folder (where Jupiter stores all the review data) – I would suggest to use /review or /.review and to store this into your versioning system, along with the file /.jupiter (where Jupiter stores the metadata about all the reviews). Be careful, if you keep the latter on the versioning system, that one contains who is the author of the review, which should be different on each machine;
  • issue types and severities – the default ones seem pretty good and they’re explained here;
  • the author of this review;
  • the default values for fields like type or severity;
  • filters for the issues for different phases of review: individual – when issues are submitted, team – when issues are discussed and assigned and rework – when issues are addressed

Now you are all set up and good to … REVIEW. Just go to a line in a file in editor, right click and select Add Review Issue … The Review Editor view will open and there you can enter all the details. You can browse the existing issues using the Review Table view and navigate directly to an issue in the file, notice the purple icon on the left side of the editor or the small purple line on the right side. Another useful feature is the ability to customize different filters, even different for each review phase.

If you will store your review files on SVN (or other versioning system), after update/checkout, don’t forget to modify in the configuration that you are the reviewer (and not the one that committed the last version). Just go to Project Properties > Review > Select a review > Edit > Author and select your name from the list. Also after update don’t forget to refresh the project and the review.

I like Jupiter, because it does not require another server or software installation, it keeps everything in some XML files so you can use your favorite versioning system and it is easy to use and straightforward. It is like a local code issue tracker, but you can also synchronize across a team using, let’s say, Subversion. And, it is integrated in Eclipse and you can do everything in one place.

Summary

Tool Eclipse update site
FindBugs http://findbugs.cs.umd.edu/eclipse/
PMD http://pmd.sf.net/eclipse
Metrics http://metrics.sourceforge.net/update
Jupiter http://jupiter-eclipse-plugin.googlecode.com/svn/trunk/site/
Categories: Software Tags: