List of All Possible Maven Interview Questions & Answers


Is there a way to use the current date in the POM?
Take a look at the buildnumber plugin. It can be used to generate a build date each time you do a build, as follows:

    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>maven-buildnumber-plugin</artifactId>
      <version>0.9.4</version>
      <configuration>
        <format>{0,date,yyyy-MM-dd HH:mm:ss}</format>
        <items>
          <item>timestamp</item>
        </items>
        <doCheck>false</doCheck>
        <doUpdate>false</doUpdate>
      </configuration>
      <executions>
        <execution>
          <phase>validate</phase>
          <goals>
            <goal>create</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
pom.xml or settings.xml? What is the best practice configuration usage for these files?
The best practice guideline between settings.xml and pom.xml is that configurations in settings.xml must be specific to the current user and that pom.xml configurations are specific to the project.
For example, <repositories> in pom.xml would tell all users of the project to use the <repositories> specified in the pom.xml. However, some users may prefer to use a mirror instead, so they’ll put <mirrors> in their settings.xml so they can choose a faster repository server.
so there you go:
settings.xml -> user scope
pom.xml -> project scope
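To make that concrete, a per-user mirror entry in settings.xml might look like this (the id, name and URL are illustrative):

    <settings>
      <mirrors>
        <mirror>
          <id>faster-repo</id>
          <name>A mirror closer to me</name>
          <url>http://mirrors.example.com/maven2</url>
          <mirrorOf>central</mirrorOf>
        </mirror>
      </mirrors>
    </settings>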

How do I indicate array types in a MOJO configuration?
Declare one child element per value under the parameter’s element, for example (the element names here are illustrative):

    <myArray>
      <param>value1</param>
      <param>value2</param>
    </myArray>

How do I point Maven 2 at a particular version of the JDK when I have different versions installed on my PC and my JAVA_HOME is already set?
If you don’t want to change your system JAVA_HOME, set it in the maven script instead.
How do I setup the classpath of my antrun plugin to use the classpath from maven?
The maven classpaths are available as ant references when running your ant script, e.g. maven.compile.classpath, maven.runtime.classpath, maven.test.classpath and maven.plugin.classpath. The full list of reference names and some examples can be found in the maven-antrun-plugin documentation.
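A minimal sketch of using one of those references from an antrun execution (the class name is hypothetical):

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-antrun-plugin</artifactId>
      <executions>
        <execution>
          <phase>compile</phase>
          <goals>
            <goal>run</goal>
          </goals>
          <configuration>
            <tasks>
              <!-- run a tool against the project's compile classpath -->
              <java classname="com.mycompany.app.SomeTool" fork="true">
                <classpath refid="maven.compile.classpath"/>
              </java>
            </tasks>
          </configuration>
        </execution>
      </executions>
    </plugin>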
Is it possible to use HashMap as configurable parameter in a plugin? How do I configure that in pom.xml?
Yes. It’s possible to use a HashMap field as a parameter in your plugin. To use it, your pom configuration should declare one child element per map entry, with the element name as the key and its content as the value (the field and key names below are illustrative):

    <configuration>
      <yourHashMap>
        <yourKey>yourvalue</yourKey>
        ...
      </yourHashMap>
    </configuration>

How do I filter which classes should be put inside the packaged jar?
All compiled classes are always put into the packaged jar. However, you can configure the compiler plugin to exclude compiling some of the java sources using the compiler parameter excludes as follows:

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <configuration>
        <excludes>
          <exclude>**/NotNeeded*.java</exclude>
        </excludes>
      </configuration>
    </plugin>


How can I change the default location of the generated jar when I command “mvn package”?
By default, the location of the generated jar is in ${project.build.directory} or in your target directory.
We can change this by configuring the outputDirectory of the maven-jar-plugin:

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-jar-plugin</artifactId>
      <configuration>
        <outputDirectory>${project.build.directory}/</outputDirectory>
      </configuration>
    </plugin>
How does maven 2 implement reproducibility?

Add the exact versions of plugins into your pluginDependencies (make use of the release plugin)

Make use of ibiblio for your libraries. This should always be the case for jars. (The group is working on stabilising metadata and techniques for locking it down even if it changes. An internal repository mirror that doesn’t fetch updates (only new) is recommended for true reproducibility.)

Why there are no dependency properties in Maven 2?
They were removed because they aren’t reliable in a transitive environment. It implies that the dependency knows something about the
environment of the dependee, which is back to front. In most cases, granted, the value for a war bundle will be the same for a particular
dependency – but that relies on the dependency specifying it.
In the end, we give control to the actual POM doing the building, trying to use sensible defaults that minimise what needs to be
specified, and allowing the use of artifact filters in the configuration of plugins.

What does aggregator mean in mojo?
When a Mojo has an @aggregator annotation, it can only build the parent project of your multi-module project, the one with a packaging of pom. It can also give you values for the expression ${reactorProjects}, where reactorProjects are the MavenProject references to the parent pom’s modules.
Where is the plugin-registry.xml?
From the settings.xml, you may enable it by setting <usePluginRegistry>true</usePluginRegistry>,
and the file will be in ~/.m2/plugin-registry.xml
How do I create a command line parameter (i.e., -Dname=value ) in my mojo?
In your mojo, put expression="${...}" in your parameter field’s annotation:

    /**
     * @parameter expression="${expression.name}"
     */
    private String exp;

You are now able to pass parameter values to the mojo on the command line:
“mvn -Dexpression.name=value install”
How do I convert my reports from Maven 1 to Maven 2?
In m1, we declare reports in the pom like this:

    <reports>
      <report>maven-checkstyle-plugin</report>
      <report>maven-pmd-plugin</report>
    </reports>

In m2, the <reports> tag is replaced with <reporting>:

    <reporting>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-checkstyle-plugin</artifactId>
        </plugin>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-pmd-plugin</artifactId>
        </plugin>
      </plugins>
    </reporting>

What does the “You cannot have two plugin executions with the same (or missing) <id> elements” message mean?
It means that you have declared multiple executions of a plugin with the same (or no) <id>. Provide each <execution> with a unique <id> and it will be ok.
How do I add my generated sources to the compile path of Maven, when using modello?
Modello generates the sources in the generate-sources phase and automatically adds the source directory for compilation in maven, so you don’t have to copy the generated sources. You have to declare the modello plugin in the build section of your pom so that the sources are generated each time.
What is Maven’s order of inheritance?

parent pom

project pom

settings

CLI parameters

where the last overrides the previous.
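As a quick illustration of that order, a property set in the parent pom is overridden by the same property in the project pom, and both lose to a -D flag on the command line (the property name is hypothetical):

    <!-- parent pom -->
    <properties>
      <app.env>dev</app.env>
    </properties>

    <!-- project pom: overrides the parent -->
    <properties>
      <app.env>staging</app.env>
    </properties>

Running “mvn -Dapp.env=prod install” then overrides both.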
How do I execute the assembly plugin with different configurations?
Add this to your pom:

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-assembly-plugin</artifactId>
      <executions>
        <execution>
          <id>1</id>
          <phase>install</phase>
          <goals>
            <goal>assembly</goal>
          </goals>
          <configuration>
            <descriptor>src/main/descriptors/bin.xml</descriptor>
            <finalName>${project.build.finalName}-bin</finalName>
          </configuration>
        </execution>
        <execution>
          <id>2</id>
          <phase>install</phase>
          <goals>
            <goal>assembly</goal>
          </goals>
          <configuration>
            <descriptor>src/main/descriptors/src.xml</descriptor>
            <finalName>${project.build.finalName}-src</finalName>
          </configuration>
        </execution>
      </executions>
    </plugin>

and run mvn install; this will execute the assembly plugin twice with different configurations.
How do I configure the equivalent of maven.war.src of the war plugin in Maven 2.0?
Use the warSourceDirectory parameter of the maven-war-plugin (the path below is illustrative):

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-war-plugin</artifactId>
      <configuration>
        <warSourceDirectory>src/main/webapp</warSourceDirectory>
      </configuration>
    </plugin>
How do I add main class in a generated jar’s manifest?
Configure the maven-jar-plugin and add your main class:

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-jar-plugin</artifactId>
      <configuration>
        <archive>
          <manifest>
            <mainClass>com.mycompany.app.App</mainClass>
          </manifest>
        </archive>
      </configuration>
    </plugin>

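After mvn package, the resulting jar can then be launched directly (the jar name here is illustrative):

    java -jar target/my-app-1.0.jar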
What does the FATAL ERROR with the message “Class org.apache.commons.logging.impl.Jdk14Logger does not implement Log” when using the maven-checkstyle-plugin mean?
Checkstyle uses commons-logging, which has classloader problems when initialized within a Maven plugin’s container. This results in the above message – if you run with ‘-e’, you’ll see something like the following:

Caused by: org.apache.commons.logging.LogConfigurationException: org.apache.commons.logging.LogConfigurationException: Class org.apache.commons.logging.impl.Jdk14Logger does not implement Log

buried deep in the stacktrace.
The only workaround we currently have for this problem is to include another commons-logging Log implementation in the plugin itself. So, you can solve the problem by adding the following to your plugin declaration in your POM:

    <plugin>
      <artifactId>maven-checkstyle-plugin</artifactId>
      <dependencies>
        <dependency>
          <groupId>log4j</groupId>
          <artifactId>log4j</artifactId>
          <version>1.2.12</version>
        </dependency>
      </dependencies>
    </plugin>

While this may seem a counter-intuitive way of configuring a report, it’s important to remember that Maven plugins can have a mix of reports and normal mojos. When a POM has to configure extra dependencies for a plugin, it should do so in the normal plugins section.
We will probably try to fix this problem before the next release of the checkstyle plugin.
UPDATE: This problem has been fixed in the SVN trunk version of the checkstyle plugin, which should be released very soon.
How do I determine the stale resources in a Mojo to avoid reprocessing them?
This can be done using the following piece of code:

    // Imports needed
    import java.util.Collections;
    import java.util.Set;

    import org.codehaus.plexus.compiler.util.scan.InclusionScanException;
    import org.codehaus.plexus.compiler.util.scan.StaleSourceScanner;
    import org.codehaus.plexus.compiler.util.scan.mapping.SuffixMapping;

    // At some point of your code
    StaleSourceScanner scanner = new StaleSourceScanner( 0, Collections.singleton( "**/*.xml" ), Collections.EMPTY_SET );
    scanner.addSourceMapping( new SuffixMapping( ".xml", ".html" ) );
    Set staleFiles = (Set) scanner.getIncludedSources( this.sourceDirectory, this.targetDirectory );

The second parameter to the StaleSourceScanner is the set of includes, while the third parameter is the set of excludes. You must add a source mapping to the scanner (second line). In this case we’re telling the scanner what is the extension of the result file (.html) for each source file extension (.xml). Finally we get the stale files as a Set calling the getIncludedSources method, passing as parameters the source and target directories (of type File). The Maven API doesn’t support generics, but you may cast it that way if you’re using them.
In order to use this API you must include the following dependency in your pom:

    <dependency>
      <groupId>org.codehaus.plexus</groupId>
      <artifactId>plexus-compiler-api</artifactId>
      <version>1.5.1</version>
    </dependency>
Is there a property file for plug-in configuration in Maven 2.0?
No. Maven 2.x no longer supports plug-in configuration via properties files. Instead, in Maven 2.0 you can configure plug-ins directly from the command line using the -D argument, or from the plug-in’s POM using the <configuration> element.
How do I determine which POM contains missing transitive dependency?
run “mvn -X”
How do I integrate static (x) html into my Maven site?
You can integrate your static pages in these steps:

Put your static pages in the resources directory, ${basedir}/src/site/resources.

Create your site.xml and put it in ${basedir}/src/site. An example below (menu and item names are just examples):

    <project name="Maven War Plugin">
      <bannerLeft>
        <name>Maven War Plugin</name>
        <src>http://maven.apache.org/images/apache-maven-project.png</src>
        <href>http://maven.apache.org/</href>
      </bannerLeft>
      <bannerRight>
        <src>http://maven.apache.org/images/maven-small.gif</src>
      </bannerRight>
      <body>
        <menu name="Overview">
          <item name="Introduction" href="introduction.html"/>
        </menu>
        ${reports}
      </body>
    </project>

Link the static pages by modifying the <menu> section: create <item> entries and map each one to the filename of a static page.

How do I run an ant task twice, against two different phases?
You can specify multiple execution elements under the executions tag, giving each a different id and binding them at different phases.

    <plugin>
      <artifactId>maven-antrun-plugin</artifactId>
      <executions>
        <execution>
          <id>one</id>
          <phase>generate-sources</phase>
          <goals>
            <goal>run</goal>
          </goals>
          <configuration>
            <tasks>
              <!-- first set of ant tasks -->
            </tasks>
          </configuration>
        </execution>
        <execution>
          <id>two</id>
          <phase>package</phase>
          <goals>
            <goal>run</goal>
          </goals>
          <configuration>
            <tasks>
              <!-- second set of ant tasks -->
            </tasks>
          </configuration>
        </execution>
      </executions>
    </plugin>

Can a profile inherit the configuration of a “sibling” profile?
No. Profiles merge when their IDs match – so you can inherit them from a parent POM (but you can’t inherit profiles from a sibling profile in the same POM).
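For example, a profile in the parent POM merges with a child profile that declares the same id (the id and contents here are illustrative):

    <!-- parent pom -->
    <profiles>
      <profile>
        <id>ci</id>
        <properties>
          <run.slow.tests>true</run.slow.tests>
        </properties>
      </profile>
    </profiles>

    <!-- child pom: merged with the parent profile because the ids match -->
    <profiles>
      <profile>
        <id>ci</id>
        <build>
          <!-- extra build configuration for CI -->
        </build>
      </profile>
    </profiles>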
How do I invoke the “maven dist” function from Maven 1.0, in Maven 2.0?
mvn assembly:assembly
See the Assembly Plugin documentation for more details.
How do I specify which output folders the Eclipse plugin puts into the .classpath file?


    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-eclipse-plugin</artifactId>
      <configuration>
        <outputDirectory>target-eclipse</outputDirectory>
      </configuration>
    </plugin>

What is a Mojo?
A mojo is a Maven plain Old Java Object. Each mojo is an executable goal in Maven, and a plugin is a distribution of one or more related mojos.
How to produce execution debug output or error messages?
You could call Maven with the -X parameter or the -e parameter. For more information, run:

    mvn --help

Maven compiles my test classes but doesn’t run them?
Tests are run by the surefire plugin. The surefire plugin can be configured to run only certain test classes, and you may have unintentionally done so by specifying a value for ${test}. Check your settings.xml and pom.xml for a property named “test”, which would look like this:

    <properties>
      <property>
        <name>test</name>
        <value>some-value</value>
      </property>
    </properties>

or

    <properties>
      <test>some-value</test>
    </properties>

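Conversely, when you do want to run a single test class, passing the property explicitly is the supported route (the class name is illustrative):

    mvn test -Dtest=MyClassTest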
How do I include tools.jar in my dependencies?
The following code includes tools.jar on Sun JDKs (it is already included in the runtime for Mac OS X and some free JDKs).


    <profiles>
      <profile>
        <id>default-tools.jar</id>
        <activation>
          <property>
            <name>java.vendor</name>
            <value>Sun Microsystems Inc.</value>
          </property>
        </activation>
        <dependencies>
          <dependency>
            <groupId>com.sun</groupId>
            <artifactId>tools</artifactId>
            <version>1.4.2</version>
            <scope>system</scope>
            <systemPath>${java.home}/../lib/tools.jar</systemPath>
          </dependency>
        </dependencies>
      </profile>
    </profiles>

I have a jar that I want to put into my local repository. How can I copy it in?
If you understand the layout of the maven repository, you can copy the jar directly into where it is meant to go. Maven will find this file next time it is run.
If you are not confident about the layout of the maven repository, then you can adapt the following command to load in your jar file, all on one line.

    mvn install:install-file
      -Dfile=<path-to-file>
      -DgroupId=<group-id>
      -DartifactId=<artifact-id>
      -Dversion=<version>
      -Dpackaging=<packaging>
      -DgeneratePom=true

Where: <path-to-file>  the path to the file to load
       <group-id>      the group that the file should be registered under
       <artifact-id>   the artifact name for the file
       <version>       the version of the file
       <packaging>     the packaging of the file e.g. jar

This should load the file into the maven repository, renaming it as needed.
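For instance, to install a vendor jar that is not available in any public repository (all values hypothetical):

    mvn install:install-file -Dfile=lib/acme-sdk.jar -DgroupId=com.acme -DartifactId=acme-sdk -Dversion=1.0 -Dpackaging=jar -DgeneratePom=true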
How do I set up Maven so it will compile with a target and source JVM of my choice?
You must configure the source and target parameters in your pom. For example, to set the source and target JVM to 1.5, you should have in your pom:

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>2.0.2</version>
      <configuration>
        <source>1.5</source>
        <target>1.5</target>
      </configuration>
    </plugin>
How can I use Ant tasks in Maven 2?

There are currently two alternatives:

For use in a plugin written in Java, Beanshell or other Java-like scripting language, you can construct the Ant tasks using the instructions given in the Ant documentation

If you have very small amounts of Ant script specific to your project, you can use the AntRun plugin.

Maven 2.0 Eclipse Plug-in

Plugins are great at simplifying the life of programmers; they reduce the repetitive tasks involved in programming. In this article our experts will show you the steps required to download and install the Maven plugin for your eclipse IDE.
Why Maven with Eclipse
Eclipse is an industry leader in IDE market, it is used very extensively in developing projects all around the world. Similarly, Maven is a high-level, intelligent project management, build and deployment tool provided by Apache’s software foundation group. Maven deals with application development lifecycle management.

Maven–Eclipse Integration makes the development, testing, packaging and deployment process easy and fast. Maven Integration for Eclipse provides a tight integration of Maven into the IDE and offers the following features:
· It helps to launch Maven builds from within Eclipse
· It provides dependency management for the Eclipse build path based on Maven’s pom.xml
· It resolves Maven dependencies from the Eclipse workspace without installing them to the local Maven repository
· It automatically downloads the required dependencies from remote Maven repositories
· It provides wizards for creating new Maven projects and pom.xml files, and for enabling Maven support on a plain Java project
· It helps to search quickly for dependencies in Maven remote repositories
· It provides quick fixes in the Java editor for looking up required dependencies/jars by class or package name.
What do you Need?
1. Get the Eclipse Development Environment :
In this tutorial we are using the eclipse-SDK-3.3-win32, which can be downloaded from http://www.eclipse.org/downloads/
2. Get Maven-eclipse-plugin-plugin :
It is available at http://mevenide.codehaus.org/maven-eclipse-plugin-plugin/

Download and Install Eclipse
First download and install Eclipse on your development machine, then proceed with the installation process of the eclipse-maven plugin.

A Maven 2.0 Repository: An Introduction

Maven repository Types:

Public remote external repository: This public external repository exists at ibiblio.org and maven synchronizes with this repository.

Private remote internal repository: We set up this repository and make changes in the maven’s pom.xml or settings.xml file to use this repository.

Local repository: This repository is maintained by the developer and stays on the developer’s machine. Its location is defined in the settings.xml file that exists in the .m2 directory at its standard location, e.g. C:\Documents and Settings\Administrator\.m2 on Windows. If no private internal repository is set up and listed in the pom.xml or settings.xml, then the local repository on the developer’s machine is synchronized with the public maven repository at ibiblio.org.
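A minimal settings.xml sketch for relocating the local repository (the path is illustrative):

    <settings>
      <localRepository>D:\mavenrepo</localRepository>
    </settings>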

Advantages of having an internal private repository :

Reduces the likelihood of version conflicts.

Requires less manual intervention for the first build.

Rather than several separate independent libraries, it provides a single central reference repository for all the dependent software libraries.

It builds the project more quickly, as maven artifacts are retrieved from the intranet server rather than from servers on the internet.

Use cases for maven repository:

Create two sub-repositories inside the internal repository:

ibiblio-cache: a cache of artifacts downloaded from ibiblio, made available to the organization. This synchronizes with the external repository at ibiblio.

internal-maven-repository: used for internal artifacts of an organization. It contains artifacts unique to the organization and is not synchronized with any repository.

Alternatively, another sub-repository can be created for artifacts that are not at ibiblio. This does not synchronize with any external repository.

Browse the remote repository by using a web browser.

Search the artifacts in the repository.

Download code from version control and make changes in settings.xml to point to the internal repository and build without any manual intervention.

Install new version of the artifacts.

Import artifacts into the repository in bulk.

Export artifacts from the repository in bulk.

Setup the task to backup the repository automatically.

Criteria for choosing a maven repository implementation: ideally, a maven repository implementation should be:

Free and open source

Provide admin tools

Easy to setup and use

Provide backup facility

Able to create, edit and delete sub repositories.

Offer anonymous read-only access as well as access control.

Deployable in any standard web server such as Tomcat or Apache.

Issue tracker, forums and other independent source of information.

Have an active developer community that keeps the product enhanced and bugs fixed.

Bulk import/export facility to move groups of artifacts into the repository and out of the repository.

Provide a repository browser: this should be a web browser rather than a desktop application.

Shifting from Apache Ant to Maven

Maven is an entirely different creature from Ant. Ant is simply a toolbox, whereas Maven is about the application of patterns in order to achieve an infrastructure which displays the characteristics of visibility, reusability, maintainability, and comprehensibility. It is wrong to consider Maven a mere build tool and just a replacement for Ant.
Ant Vs Maven
There is nothing that Maven does that Ant cannot do. Ant gives the ultimate power and flexibility in build and deployment to the developer. But Maven adds a layer of abstraction above Ant (and uses Jelly). Maven can be used to build any Java application. Today JEE build and deployment has become much more standardized. Every enterprise has some variations, but in general it is all the same: deploying EARs, WARs, and EJB-JARs. Maven captures this intelligence and lets you achieve the build and deployment in about 5-6 lines of Maven script compared to dozens of lines in an Ant build script.
Ant lets you do any variations you want, but requires a lot of scripting. Maven, on the other hand, mandates certain directories and file names, but it provides plugins to make life easier. The restriction imposed by Maven is that only one artifact is generated per project (a project in Maven terminology is a folder with a project.xml file in it). A Maven project can have sub-projects, and each sub-project can build its own artifact. The topmost project can aggregate the artifacts into a larger one (see the sketch below). This is analogous to jars and wars put together to form an EAR. Maven also provides inheritance in projects.
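A Maven 2 sketch of that aggregation idea, with a parent pom listing its sub-projects as modules (all names hypothetical):

    <project>
      <modelVersion>4.0.0</modelVersion>
      <groupId>com.mycompany</groupId>
      <artifactId>my-parent</artifactId>
      <version>1.0</version>
      <packaging>pom</packaging>
      <modules>
        <module>my-ejb</module>
        <module>my-war</module>
        <module>my-ear</module>
      </modules>
    </project>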
Maven : Stealing the show
Maven simplifies builds enormously by imposing certain fixed file names and acceptable restrictions, like one artifact per project. Artifacts are treated as files on your computer by the build script. Maven hides the fact that everything is a file and lets you think in terms of creating a deployable artifact such as an EAR. An artifact can declare a dependency on a particular version of a third-party library residing in a shared remote (or local) enterprise repository, and you can publish your own library into the repository as well for others to use. Hence there are no more classpath issues and no more mismatches in libraries. It also gives you the power to embed even Ant scripts within Maven scripts, if absolutely essential.

Maven 2.0: Features

Maven is a high-level, intelligent project management, build and deployment tool provided by Apache’s software foundation group. Maven deals with application development lifecycle management. Maven was originally developed to manage and to minimize the complexities of building the Jakarta Turbine project. But its powerful capabilities have made it a core entity of the Apache Software Foundation projects. Actually, for a long time there was a need for a standardized project development lifecycle management system, and Maven has emerged as a perfect option that meets that need. Maven has become the de facto build system in many open source initiatives and it is rapidly being adopted by many software development organizations.
Maven was born of the very practical desire to make several projects at Apache work in a consistent manner, so that developers could freely move between these projects, knowing clearly how they all worked by understanding how one of them worked.

If a developer spent time understanding how one project was built, the intent was that they would not have to go through this process again when they moved on to the next project. The same idea extends to testing, generating documentation, generating metrics and reports, and deploying. All projects share enough of the same characteristics, an understanding of which Maven tries to harness in its general approach to project management.
On a very high level, all projects need to be built, tested, packaged, documented and deployed. There is infinite variation in each of these steps, but the variation still occurs within the confines of a well-defined path, and it is this path that Maven attempts to present to everyone in a clear way. The easiest way to make a path clear is to provide people with a set of patterns that can be shared by anyone involved in a project.

The key benefit of this approach is that developers can follow one consistent build lifecycle management process without having to reinvent such processes again. Ultimately this makes developers more productive, agile, disciplined, and focused on the work at hand rather than spending time and effort doing grunt work understanding, developing, and configuring yet another non-standard build system.
Maven: Features

Portable: Maven is portable in nature because build configurations using maven are portable to another machine, developer and architecture without any effort.

Non-trivial: all file references need to be relative, and the environment must be completely controlled and independent from any specific file system.

Technology: Maven is a simple core concept that is activated through an IoC container (Plexus). Everything is done in maven through plugins and every plugin works in isolation (ClassLoader). Plugins are downloaded from a plugin repository on demand.

Maven’s Objectives:
The primary goal of maven is to allow developers to comprehend the complete state of a project in the shortest time, by means of an easy build process, a uniform building system, quality project management information (such as change logs, cross-references, mailing lists, dependencies, unit test reports, test coverage reports and many more), guidelines for best practices and transparent migration to new features. To achieve this goal Maven attempts to deal with several areas:

It makes the build process easy

Provides a uniform building system

Provides quality related project information

Provides guidelines for development best practices.

Allows transparent migration to new features.

Introduction to Maven 2.0

Maven2 is an Open Source build tool that revolutionized the way projects are built. Unlike build systems such as “make” and “ant”, it is not a language to combine build components but a build lifecycle framework. A development team does not require much time to automate a project’s build infrastructure, since maven uses a standard directory layout and a default build lifecycle. Different development teams under a common roof can set up their working standards in a very short time, which results in a more stable automated build infrastructure. Also, because most setups are simple and immediately reusable in all projects using maven, many important reports, checks, and build and test automations can be added to all projects. This was not possible before maven because of the heavy cost of every project setup.

Maven 2.0 was first released on 19 October 2005, and it is not backward compatible with the plugins and projects of maven1. In December 2005 a lot of plugins were added to maven, but not all plugins that exist for maven1 have been ported yet. Maven 2 is expected to stabilize quickly with most of the Open Source technologies. People are introduced to maven as the core build system for Java development, in single-project and multi-project environments. With a little knowledge of maven, developers are able to set up a new project and become familiar with the default maven project structure. They can configure maven and its plugins for a project, enable common settings for maven and its plugins over multiple projects, and generate, distribute and deploy products and reports with maven, using repositories to set up a company repository. Developers can also learn about the most important plugins: how to install, configure and use them, and how to evaluate other plugins so they can be integrated into their work environment.

Maven is the standard way to build projects, and it also provides various other capabilities, such as clarifying the definition of the project and providing ways to share jars across projects. It also provides an easy way to publish project information (OSS).
Originally maven was designed to simplify the build process in the Jakarta Turbine project. There were several projects, each containing slightly different Ant build files, and JARs were checked into CVS. The Apache group wanted a tool that could build the projects, publish project information, define what the projects consisted of, and share JARs across several projects. The result of all these requirements was maven, a tool that builds and manages Java-based projects.

Why is maven a great build tool? How does it differ from other build tools?
Tell me more about Profiles and Nodes in Maven?
Tell me more about local repositories?
How did you configure local repositories in different environments (Development, Testing, Production etc)?
What are transitive dependencies in maven 2?
Did you write plugins in maven? If so, what are they?
Why is a matrix report required during a new release? How does this benefit the QA Team?
What are pre-scripts and post-scripts in maven? Illustrate with an example?
What are the checklists for artifacts? And what are the checklists for a source code artifact?
Tell me about your experience with static code analysis?

Reference:
http://www.javabeat.net

 


How to Run/Deploy Java EE applications on Amazon EC2?


Running Java EE applications on Amazon EC2: deploying to 20 machines with no money down

Computer hardware has traditionally been a scarce, expensive resource. In the early days of computing developers had to share a single machine. Today each developer usually has their own machine but it’s rare for a developer to have more than one. This means that running performance tests often involves scavenging for machines.  Likewise, replicating even just part of a production environment is a major undertaking. With Amazon’s Elastic Compute Cloud (EC2), however, things are very different. A set of Linux servers is now just a web service call away. Depending on the type of the servers you simply pay 10-80 cents per server per hour for up to 20 servers! No more upfront costs or waiting for machines to be purchased and configured.

To make it easier for enterprise Java developers to use EC2, I have created EC2Deploy.  It’s a Groovy framework for deploying an enterprise Java application on a set of Amazon EC2 servers. EC2Deploy provides a simple, easy to use API for launching a set of EC2 instances; configuring MySQL, Apache and one or more Tomcat servers; and deploying one or more web applications. In addition, it can also run JMeter and collect performance metrics.

Here is an example script that launches some EC2 instances; configures MySQL with one slave, Tomcat and Apache; deploys a single web application on the Tomcat server; and runs a JMeter test with first one thread and then two.

class ClusterTest extends GroovyTestCase {
  void testSomething() {
    AWSProperties awsProperties = new
        AWSProperties("/…/aws.properties")

    def ec2 = new EC2(awsProperties)

    def explodedWar = '…/projecttrack/webapp/target/ptrack'

    ClusterSpec clusterSpec =
       new ClusterSpec()
            .schema("ptrack", ["ptrack": "ptrack"],
                    ["src/test/resources/testdml1.sql",
                     "src/test/resources/testdml2.sql"])
            .slaves(1)
            .tomcats(1)
            .webApp(explodedWar, "ptrack")
            .catalinaOptsBuilder({builder, databasePrivateDnsName ->
                 builder.arg("-Xmx500m")
                 builder.prop("com.sun.management.jmxremote")
                 builder.prop("com.sun.management.jmxremote.port", 8091)
                 builder.prop("com.sun.management.jmxremote.authenticate",
                                     false)
                 builder.prop("com.sun.management.jmxremote.ssl", false)
                 builder.prop("ptrack.application.environment", "ec2")
                 builder.prop("log4j.configuration",
                               "log4j-minimal.properties")
                 builder.prop("jdbc.db.server", databasePrivateDnsName)})

    SimpleCluster cluster = new SimpleCluster(ec2, clusterSpec)

    cluster.start()

    cluster.loadTest("…/projecttrack/functionalTests/jmeter/SimpleTest.jmx",
        [1, 2])

    cluster.stop()
  }
}

Let’s look at each of the pieces.

First, we need to configure the framework as follows:

    AWSProperties awsProperties = new
        AWSProperties("/…/aws.properties")
    def ec2 = new EC2(awsProperties)

The aws.properties file contains various properties including the Amazon WS security credentials and the EC2 AMI (i.e. OS image) to launch. All servers use my EC2 appliance AMI that has Java, MySQL, Apache, Tomcat, Jmeter and some other useful tools pre-installed.

Next we need to configure the servers:

     ClusterSpec clusterSpec =
        new ClusterSpec()
             .schema("ptrack", ["ptrack": "ptrack"],
                    ["src/test/resources/testdml1.sql",
                     "src/test/resources/testdml2.sql"])
             .slaves(1)
             .tomcats(1)
             .webApp(explodedWar, "ptrack")
             .catalinaOptsBuilder({builder, databasePrivateDnsName ->
                 builder.arg("-Xmx500m")
                 builder.prop("com.sun.management.jmxremote")
                 builder.prop("com.sun.management.jmxremote.port", 8091)
                 builder.prop("com.sun.management.jmxremote.authenticate",
                                     false)
                 builder.prop("com.sun.management.jmxremote.ssl", false)
                 builder.prop("ptrack.application.environment", "ec2")
                 builder.prop("log4j.configuration",
                               "log4j-minimal.properties")
                 builder.prop("jdbc.db.server", databasePrivateDnsName)})

     SimpleCluster cluster = new SimpleCluster(ec2, clusterSpec)

This code first creates a ClusterSpec, which defines the configuration of the machines and the applications:

  • schema() – specifies the name of the database schema to create, the names of the users and their passwords, and the DML scripts to execute once the database has been created
  • slaves() – specifies how many MySql slaves to create
  • tomcats() – specifies how many Tomcats to run
  • webApp() – configures a web application. This method takes two parameters: the path to the exploded WAR directory (conveniently created by Maven) and the context to deploy the web application under
  • catalinaOptsBuilder() – supplies a closure that takes a builder and the DNS name of the MySQL server as arguments and returns the CATALINA_OPTS used to launch Tomcat. Its primary purpose is to configure the web application(s) to use the correct database server

It then creates a cluster with that specification.

We then start the cluster:

    cluster.start()

At this point EC2Deploy will:

  1. Launch the EC2 instances running my appliance AMI.
  2. Initialize the MySql master database
  3. Create the MySql slave
  4. Create the database schema and the users
  5. Run any DML scripts (these are cached on S3 in a bucket called “tmp–dml” for the reasons described next)
  6. Upload the web applications to Amazon S3 (Simple Storage Service) where they are cached in order to avoid time consuming uploads (over slow DSL connections, for example). EC2Deploy only uploads new and changed files, which means that the bulky 3rd party libraries are only uploaded once. Each web application is stored in an S3 bucket called -tmp-war. If this bucket does not exist you will see some warning messages and the bucket will be created.
  7. Deploy the web applications on each of the Tomcat servers
  8. Configure Apache to load balance across the Tomcat servers

Once the cluster is started we can run a JMeter load test:

    cluster.loadTest("…/projecttrack/functionalTests/jmeter/SimpleTest.jmx", [1, 2])

The first argument specifies the test to run and the second argument is a list of JMeter thread counts. In this example, EC2deploy first runs the load test with one thread and then two threads. For each test run, it generates a report describing CPU utilization for each machine, average response time and throughput.

Finally, we stop the EC2 instances:

cluster.stop()

As you can see, EC2Deploy makes it pretty easy to deploy and test your enterprise Java application. I’ve used it to clone a production environment and run load tests. NOTE 1/28/08: The source code for EC2Deploy, along with a very cool Maven plugin, is now available!


EC2Deploy and the Cloud Tools Maven plugin are now available


I’m pleased to announce that EC2Deploy – a Groovy-based framework for deploying Java EE applications to Amazon EC2 – is now available as part of the Cloud Tools open source project.

There are three main parts to Cloud Tools:

  • The EC2Deploy framework
  • Amazon Machine Images (AMIs) that are configured to run Tomcat and work with EC2Deploy
  • A Maven plugin that uses EC2Deploy to deploy a web application to EC2

I’m especially excited about the Maven plugin. Once you have configured the plugin for your web application you can use the following goals:

  • cloudtools:deploy – launch the EC2 instances and deploy the web application
  • cloudtools:redeploy – redeploy the web application (upload the changes and restart tomcat)
  • cloudtools:jmeter – run a Jmeter test
  • cloudtools:stop – stop the EC2 instances

Cloudtools is still a work in progress, but it lets you deploy a web application on EC2 in just a few minutes.  To learn more go to Cloud Tools.


Cloud Tools now supports Amazon Elastic Block Store


One of the exciting new features of Amazon EC2 is Elastic Block Store, which provides truly durable storage for your instances. Prior to EBS, the contents of the file system disappeared once an instance was terminated. This meant that if you wanted to run a database server on EC2 you had to use MySql master-slave replication with frequent backups to Amazon S3. With EBS running a database on EC2 is a lot easier. You can simply create an EBS volume, attach it to an instance, and create a filesystem that gives you long-lived disk storage for your database. Moreover, you can easily back up an EBS volume by creating a snapshot (stored in S3). And, if you ever need to restore your data, you can create a volume from a snapshot.

Cloud tools now supports Amazon EBS. You can launch an application with a database stored on a brand new volume; on an existing volume; or on a volume created from a snapshot. You can also convert an already running application to use elastic block storage. Finally, you can create an EBS snapshot of the database. Currently, only the Maven plugin supports this functionality but I plan to update the Grails plugin shortly.

Please check out the project’s home page for more information and send me feedback.


Amazon EC2 key pairs and other stumbling blocks – Guide


While working with Cloud Tools and Cloud Foundry users, I have noticed that EC2 key pairs and security group configuration are common stumbling blocks for people who are new to Amazon EC2. When you sign up for an AWS account you get what can be, at first, a confusing set of credentials: an access key id, a secret access key, an X509 certificate and a corresponding private key. You authenticate an AWS request using either the access key id and secret access key or the X509 certificate and private key. Some APIs and tools support both options, whereas others support just one. And, to make matters worse, to launch an EC2 instance and access it via SSH you must use a (named) EC2 key pair. This EC2 key pair is not the same as the X509 certificate/private key given to you by AWS during sign up. But they are easily confused, since they both consist of private and public keys.

You create an EC2 key pair by using one of the AWS tools: the command line tools, the ElasticFox plugin or the rather nice AWS console. Under the covers these tools make an AWS request to create the key pair.

Here is a screenshot of the AWS Console showing how you create a key pair.

Creating a Key Pair

There are three steps:

  1. Select Key Pairs
  2. Click  Create Key Pair
  3. Enter the name of the Key Pair you want to create – you chose the name

The console will then create the key pair and prompt you to save the private key.

Saving a key pair

You specify the key pair name in the AWS request that launches the instances, and specify the private key file as the -i argument to ssh when connecting to the instance. Just make sure you save the key pair in a safe place.

Another stumbling block is that you need to enable SSH in the AWS firewall. Both Cloud Tools and Cloud Foundry use SSH to configure the instances and deploy the application. If SSH is blocked then they won’t work. Fortunately, the AWS firewall (a.k.a. security groups) is extremely easy to configure using the AWS tools – command line tools, ElasticFox plugin or the nice AWS console – by editing the default security group to allow SSH traffic.
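For example, with the EC2 command line tools, allowing SSH into the default security group is a one-liner (assuming the tools are installed and configured with your credentials):

    ec2-authorize default -p 22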

The good news is that these are relatively minor hurdles to overcome. Once you have sorted out your EC2 key pair and edited the security groups to enable SSH using Cloud Tools or Cloud Foundry to deploy your web application is very easy.


Cloud Computing Trends | Cloud Adoption Analysis | Organizations


We just finished the first decade of this century/millennium. The early part of this decade saw great worry about the Year 2000 problem. Much gloom and doom was predicted, but things passed off smoothly. No apocalyptic upheaval.

As we usher in the next decade, the biggest buzzword is “Cloud Computing”, a rapprochement of ASP, SaaS, SOA, Virtualization, Grid Computing, Enterprise 2.0, etc. All these buzzwords have been making the rounds over past few years. Finally, computing as a “utility” seems practical and doable. Amazon took the lead in introducing AWS (Amazon Web Services) way back in 2003. It then brought in Storage as a Service concept via S3 (Simple Shared Storage). It also introduced EC2 (Elastic Computing Cloud), where Infrastructure as a Service became viable.

I just read a nice summary of this written by M.R. Rangaswamy of the Sand Hill Group. While the momentum is on, MR says large enterprises are going to be slow adopters. Much cloud adoption is in the SMB arena, where lower TCO and capex override any concern for security and scale. Older vendors like IBM will offer a hybrid model – in-house systems plus cloud. This is a no-brainer, as there is a huge legacy of production systems in Fortune 1000 companies running on premises. But “pure cloud” vendors like Google, Amazon, and SalesForce.com will push for a “cloud-only” approach.

Another area of interest is data management, where volumes never seen before are now common. There is the NoSQL movement to deal with unstructured data, and frameworks like Hadoop combined with the MapReduce algorithm are gaining quick adoption for fast search.

This decade will see a big landscape change in the computing arena – from the model of computing to how we store and manage data for access and analytics.

Welcome to 2010.


Workforce Management Software Helps Call Centers Save Money


One way to increase revenues in your inbound call center might be via workforce management software.

For call centers that realize revenue by answering calls (be they catalogues, reservation centers, what have you), workforce management automation can help reduce queue times and improve service, thereby reducing the number of abandoned calls and increasing revenue calls completed.

These call centers can increase revenues by tens of thousands of dollars per year in addition to the cost savings.

And since cost is a prime consideration, you’ll want to look at the SaaS-based model.

Do be careful, though: “Often vendors who sell on-premise software may offer a hosted model for on-demand options and often misleadingly call it SaaS-based software,” say officials of Monet Software, which offers cloud-based WFM. “However, sometimes it’s simply a hosted client server application on a server at the vendor’s site, providing an application that was not originally designed to be hosted and delivered, with a few changes, over the Web via a single, dedicated server.”

You’ll be able to tell such impostors as they’ll almost always lack multi-tenant architecture and require separate servers and installations for each customer. In the end, Monet officials warn, they’re “much more costly and less scalable, and also usually require support for multiple releases, which is very resource intensive.”

Genuinely useful SaaS workforce management software, however, is a boon to users. A product such as Monet WFM Live uses a new multi-tenant architecture “designed to deliver Web-based applications at the lowest possible cost,” company officials say, focusing on “fast set up, low operating costs through shared services, highest security for Web-based deployment and high performance and scalability through the scaling of computer resources also called ‘elastic cloud computing.'”

This is nicely cost-effective, as it ensures available computing capacity only when you need it, at the lowest possible cost.

With SaaS there’s no large upfront investment for software and hardware either, it’s usually offered via a low monthly subscription fee that includes support, maintenance and upgrades.

And of course with the SaaS provider managing the IT infrastructure, costs are lowered by avoiding IT participation time for hardware and software issues as well as the personnel resources required for IT management. These “hidden costs” for hardware replacements, upgrades, and IT operation resources are typical for other premise-based software.


Running MSBuild 4.0 and MSBuild 3.5 on Continuous Integration


With Visual Studio 2010 RC released recently, we jumped on the release and began to code with VS2010.  One issue that popped up was that now all builds were targeting MSBuild 4.0.

That didn’t seem to be a big problem until our CruiseControl CI server kicked in, downloaded our updated code and failed to build the upgraded projects.

Fortunately there is a very quick solution to this little problem.  There are a couple of requirements.

1. You need to have VS2010 RC installed somewhere
2. You need to download the .Net Framework 4.0 (I recommend the full version and not just the Client Profile, it ensures you don’t miss anything)

To fix, do the following:

1. download and install the .Net Framework 4.0 on the CI server (then restart the server)
2. on the computer where VS2010 RC is installed go to the following path:
%programfiles%\MSBuild\Microsoft\VisualStudio
3. copy the v10.0 folder located in that directory into the CI server at the same path (or wherever our MSbuild path is on the CI server)
4. Once that is done, edit the ccnet.config file at the <msbuild> task’s <executable> tag and change it to the newly installed .Net 4.0 Framework (you should only need to change the section “\v3.5\” to “\v4.0.xxxxx\”)
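A trimmed sketch of the relevant piece of ccnet.config (replace v4.0.xxxxx with the framework directory actually installed on your CI server):

    <msbuild>
      <executable>C:\WINDOWS\Microsoft.NET\Framework\v4.0.xxxxx\MSBuild.exe</executable>
      <!-- the rest of the msbuild task configuration stays unchanged -->
    </msbuild>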

Hope this helps


Issues Compiling VS2010 solutions (with web projects) from Nant | MSB4064 error


Recently I upgraded a project of mine (the Dimecasts code base) to use VisualStudio 2010.  In the process everything worked just fine from the IDE, but when I tried to compile it from the command line I would get the following errors:

Error MSB4064: The “Retries” parameter is not supported by the “Copy” task.
Error MSB4063: The “Copy” task could not be initialized with its input parameters.

After a bit of googling I came across a post (which of course I cannot find now) which said that if you open up your .proj files, change the line that points to the v10.0 build of web applications, and reset it back to 9.0, everything would compile. And this did work… BUT when you try to open that project up again in VS 2010 it will simply revert your changes… so this is not a working solution.

Next I decided to switch my target framework in Nant from 3.5 to 4.0, but of course my nant.exe.config file does not support 4.0 yet.  So after a bit more googling I found this post that gives details on how to add the missing values to the config file.

When I added the config information to my Nant.exe.config file things were better, but still not great.  Now I was getting an error that said:

The “vendor” attribute does not exist, or has no value.

To resolve this I added the following attribute under the <framework> node in my config:
vendor="Microsoft"

After this I got another error…. This time it said that .Net Framework 4.0 was not installed. But I knew this was not right. After looking at the information for a few more seconds I realized the issue: the example config from the post above was built on an older version of the 4.0 framework (.20506) and I have .30128.

I changed all values in the nant.exe.config file that were v4.0.20506 to be v4.0.30128 and NOW I am able to compile.

So long story short, if you are getting the MSB4064 error you need to do the following:

1. Point nant to use the 4.0 framework tools
2. Follow this post  and copy the framework section to your Nant.exe.config file
3. Add the missing ‘vendor’ attribute to the new framework section
4. Update the version in the new framework section to match the version you have on disk (check C:\Windows\Microsoft.NET\Framework for versions)
5. Compile again


JUnit 4 Test Logging Tips using SLF4J


When writing JUnit tests developers often add log statements that can help provide information on test failures. During the initial attempt to find a failure a simple System.out.println() statement is usually the first resort of most developers.

Replacing these System.out.println() statements with log statements is the first improvement on this technique. Using SLF4J (Simple Logging Facade for Java) provides some neat improvements using parameterized messages. Combining SLF4J with JUnit 4 rule implementations can provide more efficient test class logging techniques.

Some examples will help to illustrate how SLF4J and JUnit 4 rule implementations offer improved test logging techniques. As mentioned, the initial solution used by most developers is the System.out.println() statement. The simple example code below shows this method.

    import org.junit.Test;

    public class LoggingTest {

      @Test
      public void testA() {
        System.out.println("testA being run...");
      }

      @Test
      public void testB() {
        System.out.println("testB being run...");
      }
    }

The obvious improvement here is to use logging statements rather than the System.out.println() statements. Using SLF4J enables us to do this simply whilst allowing the end user to plug in their desired logging framework at deployment time. Replacing the System.out.println() statements with SLF4J log statements directly results in the following code.

    import org.junit.Test;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    public class LoggingTest {

      final Logger logger =
        LoggerFactory.getLogger(LoggingTest.class);

      @Test
      public void testA() {
        logger.info("testA being run...");
      }

      @Test
      public void testB() {
        logger.info("testB being run...");
      }
    }

Looking at the code, it feels that the hard-coded method name in the log statements would be better obtained using JUnit 4’s TestName rule. This rule makes the test name available inside method blocks. Replacing the hard-coded string value with the TestName rule implementation results in the following updated code.

    import org.junit.Rule;
    import org.junit.Test;
    import org.junit.rules.TestName;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    public class LoggingTest {

      @Rule public TestName name = new TestName();

      final Logger logger =
        LoggerFactory.getLogger(LoggingTest.class);

      @Test
      public void testA() {
        logger.info(name.getMethodName() + " being run...");
      }

      @Test
      public void testB() {
        logger.info(name.getMethodName() + " being run...");
      }
    }

SLF4J offers an improved method to the log statement in the example above which provides faster logging. Use of parameterized messages enable SLF4J to evaluate whether or not to log the message at all. The message parameters will only be resolved if the message will be logged. According to the SLF4J manual this can provide an improvement of a factor of at least 30, in case of a disabled logging statement.

Updating the code to use SLF4J parameterized messages results in the following code.

    import org.junit.Rule;
    import org.junit.Test;
    import org.junit.rules.TestName;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    public class LoggingTest {

      @Rule public TestName name = new TestName();

      final Logger logger =
        LoggerFactory.getLogger(LoggingTest.class);

      @Test
      public void testA() {
        logger.info("{} being run...", name.getMethodName());
      }

      @Test
      public void testB() {
        logger.info("{} being run...", name.getMethodName());
      }
    }

Quite clearly the logging statements in this code don’t conform to the DRY principle.

Another JUnit 4 rule implementation enables us to correct this issue. Using TestWatchman we are able to create an implementation that overrides starting(FrameworkMethod method) to provide the same functionality whilst maintaining the DRY principle. The TestWatchman rule also enables developers to override methods invoked when the test finishes, fails or succeeds.

Using the TestWatchman Rule results in the following code.

    import org.junit.Rule;
    import org.junit.Test;
    import org.junit.rules.MethodRule;
    import org.junit.rules.TestWatchman;
    import org.junit.runners.model.FrameworkMethod;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    public class LoggingTest {

      @Rule public MethodRule watchman = new TestWatchman() {
        public void starting(FrameworkMethod method) {
          logger.info("{} being run...", method.getName());
        }
      };

      final Logger logger =
        LoggerFactory.getLogger(LoggingTest.class);

      @Test
      public void testA() {
      }

      @Test
      public void testB() {
      }
    }

And there you have it. A nice test code logging technique using JUnit 4 rules taking advantage of SLF4J parameterized messages.

I would be interested to hear from anyone using this or similar techniques based on JUnit 4 rules and SLF4J.

Reference: http://www.catosplace.net/
