I changed the blog configuration to ease future blogging. As a result, older entries may not be formatted correctly.

Showing posts with label tools. Show all posts

Saturday, 2 June 2012

NetBeans

After some discussions with friends about their work as developers, they mentioned that they were using NetBeans as their IDE. Curious, I downloaded it and tried a JSF example, after my attempt to create a Grails project failed because the Grails website was not available.

My first impressions are not bad. The interface seems quite nice and responsive, but I am a little bit lost.

So what should I do and look at to study this IDE, which is new to me?

The following points seem important to me:

  • editor feedback: errors, warnings, and suggestions
  • Maven integration
  • SCM integration, e.g. Subversion and Git
  • debugger integration
  • server creation and use

But I have the feeling that the list is not yet complete enough... not even in the Pareto sense.

Thursday, 6 May 2010

AXIOM - an Apache StAX-based Object Model

I will have to take a look at Axiom, which provides a StAX-based object model for accessing XML Infosets. It was developed for Axis2, but it can be used independently.

Apache Tika - Content and Metadata Extraction in Java

Apache Tika is a useful tool for extracting text and metadata from a number of formats.

For example, suppose you have a document (PDF, DOC, ...) on the web from which you wish to extract some content. For this you can use Tika:

curl http://urltodoc/.../document.pdf | java -jar tika-app/target/tika-app-0.7.jar --text
produces the plain text of the document. Other options exist to return HTML, XHTML, or only the metadata of the document.

Maven Integration

As with other Maven projects, you can specify the dependency in the POM. Note, however, that depending on your needs you might want to specify one of these artifacts (mostly quoted from this page):

  • tika-core/target/tika-core-0.7.jar Tika core library. Contains the core interfaces and classes of Tika, but none of the parser implementations. Depends only on Java 5.
  • tika-parsers/target/tika-parsers-0.7.jar Tika parsers. Collection of classes that implement the Tika Parser interface based on various external parser libraries.
  • tika-app/target/tika-app-0.7.jar Tika application. Combines the above libraries and all the external parser libraries into a single runnable jar with a GUI and a command line interface.
  • tika-bundle/target/tika-bundle-0.7.jar Tika bundle. An OSGi bundle that includes everything you need to use all Tika functionality in an OSGi environment.

<dependency>
  <groupId>org.apache.tika</groupId>
  <artifactId>tika-core</artifactId>
  <version>0.7</version>
</dependency>
If you want to use Tika to parse documents (instead of simply detecting document types, etc.), you'll want to depend on tika-parsers instead:
<dependency>
  <groupId>org.apache.tika</groupId>
  <artifactId>tika-parsers</artifactId>
  <version>0.7</version>
</dependency>
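With tika-parsers on the classpath, extraction can also be done programmatically. The following is a minimal sketch of the parser API as of Tika 0.7; it assumes tika-parsers 0.7 and its transitive dependencies are available, and it feeds the parser an in-memory text stream purely for illustration (in practice you would open a FileInputStream on your document):

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;

import org.apache.tika.metadata.Metadata;
import org.apache.tika.parser.AutoDetectParser;
import org.apache.tika.parser.ParseContext;
import org.apache.tika.sax.BodyContentHandler;
import org.xml.sax.ContentHandler;

public class TikaSketch {
    public static void main(String[] args) throws Exception {
        // Any InputStream works; a real use case would read a PDF or DOC file.
        InputStream stream =
            new ByteArrayInputStream("Hello Tika".getBytes("UTF-8"));
        ContentHandler handler = new BodyContentHandler(); // collects plain text
        Metadata metadata = new Metadata();                // filled in by the parser
        // AutoDetectParser picks the concrete parser from the detected MIME type.
        new AutoDetectParser().parse(stream, handler, metadata, new ParseContext());
        System.out.println(handler.toString().trim());
        System.out.println(metadata.get(Metadata.CONTENT_TYPE));
    }
}
```

The same handler/metadata pair is what the command-line `--text` and `--metadata` options expose, so this is essentially the tika-app pipeline without the packaging.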

Sunday, 2 November 2008

OProfile - What art thou?

O.K., apart from making silly references to films I have not actually liked much, what am I doing?

After reading a few mails from the Linux Kernel Mailing List, I found the following tool, which seems useful: OProfile. I must admit I still do not have a clear idea of all the possibilities that this tool offers.

The first thing to know is that it is a profiler, and one capable of collecting a lot of information about a program with quite low overhead. But what is a profiler?

A profiler collects information on running processes of a computer to analyze the performance of a program (see the wikipedia article on performance analysis).

OProfile makes it possible to look at the following aspects of a system:

  • call graphs (i.e. the graph of function calls in a program)
  • libraries used by a program
  • instruction-level information collection (down to the assembly level)
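As a sketch of the legacy opcontrol-based workflow (this needs root, and `./my_program` stands in for whatever binary you want to profile):

```shell
# Load the OProfile kernel module and start system-wide sampling.
opcontrol --init
opcontrol --start

# Run the workload while samples are being collected.
./my_program

# Flush the current samples to disk, then inspect them.
opcontrol --dump
opreport --symbols ./my_program   # per-symbol sample counts

# Stop the daemon when done.
opcontrol --shutdown
```

This cannot run in a sandbox since it depends on the kernel module and hardware performance counters; it is only meant to show the shape of a profiling session.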
I will probably continue to look at the possibilities of this tool over the next few weeks.