Wednesday, March 26, 2008
How to write an application for social networks
I've been seeing more and more applications for social networks (like Hi5). What I didn't realise was that there is a standard API for developing those applications. OpenSocial is a reference API from Google that is being adopted by some of the major social networks (LinkedIn, Hi5, MySpace). It makes sense, as it allows a developer to create an application or gadget that works on several social networks, or even enables communication between them.
And this explains the number of new applications that are emerging every day. Still, it amazes me that a consensus was reached to allow this to happen. I thought only the W3C had the power to do that. On the other hand, Google is becoming more of a "standard" than anything else...
On the technical side, the OpenSocial site seems quite good, with lots of documentation and sample applications. I haven't had the time to read the API itself, but it being so well documented is a great thing. It makes me think it must be easy to implement this kind of application. It makes me want to implement something :)
This also makes me wonder how many interesting Google projects are out there that I've never heard of...
Friday, March 07, 2008
Exhibit 2.0 - Javascript web framework
I just discovered Exhibit and I'm amazed by its simplicity. Exhibit is a web framework written in Javascript that you can use to display data. It's not a general purpose framework. It's focused on displaying data in several formats, like interactive tables, maps, graphs, etc.
What I like about Exhibit is how simple it is to use. Everything is done in a simple HTML file! You heard me right, just HTML. The framework is written in Javascript, and it does all the data manipulation. You just have to add some special attributes to the HTML tags in order to specify how the data is going to be shown. That's the presentation part of the framework. The data itself has to be in JSON format. Just add a link tag pointing at the js file with your data structure and you're all set. This is actually very flexible. In a simple case you can point to a static js file. But you can also point to a URL that dynamically generates your JSON data.
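Just to make that last point concrete, here's a minimal sketch of how such a URL could be backed by a servlet. The class name and the data are invented for the example, the "items"/"label" keys are written from memory (check the Exhibit documentation for the exact JSON schema), and it assumes the servlet API is on the classpath:
import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Hypothetical servlet that generates Exhibit-style JSON on the fly
// instead of serving a static js file.
public class ExhibitDataServlet extends HttpServlet {
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        resp.setContentType("application/json");
        PrintWriter out = resp.getWriter();
        // In a real application this would come from a database or another service.
        out.print("{ \"items\": [");
        out.print("  { \"label\": \"Mr. Smith\", \"type\": \"Customer\" },");
        out.print("  { \"label\": \"Ms. Jones\", \"type\": \"Customer\" }");
        out.print("] }");
    }
}
The link tag in the HTML then just points at this servlet's URL instead of a static file.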
Exhibit then gives you a few presentation choices. One of them is through Google Maps. This way you can create mashups quite easily. Interactive tables that allow filtering and sorting also look nice.
Exhibit is part of the SIMILE project, a joint effort conducted by the MIT Libraries and MIT CSAIL. So simple, but it can be quite useful...and pretty cool.
Thursday, March 06, 2008
Python developers hired by Sun
This is interesting news:
Python’s future looks bright by ZDNet's Ed Burnette -- It always warms my heart to see good programmers get the recognition they deserve. This week, Sun announced they were hiring Ted Leung (long-time Python developer), and Frank Wierzbicki (lead implementer of the Jython project). They’ll be working full-time on Jython and in particular paying attention to developer tools. Ted and Frank join Charles Nutter, Thomas [...]
Although Jython is not new, this shows the current trend at Sun: to support as many languages as possible on the JVM. It's nice to see that not all the eggs are being put in the same basket (Ruby) and that Python is also getting some attention. I especially like that because I prefer Python over Ruby. This also reminds me to put Jython on my to-do list of technologies to check out :)
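For reference, embedding Jython in a Java program is pretty straightforward. This is just a minimal sketch, assuming the Jython jar is on the classpath; the class names come from Jython's embedding API as I remember it, so double-check against the Jython docs:
import org.python.core.PyObject;
import org.python.util.PythonInterpreter;

public class JythonHello {
    public static void main(String[] args) {
        // Run Python code inside the JVM via Jython's interpreter.
        PythonInterpreter interp = new PythonInterpreter();
        interp.exec("def greet(name):\n    return 'Hello, ' + name\n");
        PyObject result = interp.eval("greet('Jython')");
        System.out.println(result); // prints: Hello, Jython
    }
}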
Thursday, February 28, 2008
PDFEdit - Editing PDF in Linux
Have you ever had the need to alter a PDF file? For me, it doesn't happen very often, but when it does I can only think of Acrobat Writer. I know I have also searched the Internet for some free tool, but it's not easy to find one. I've searched for "editing PDFs in Linux" before and come up with nothing but some technique involving saving each page to PostScript, then editing it in an image editor, etc...
So it was with a bit of surprise that I found PDFEdit. It does its job well and has lots of features. At first glance it seems like an application that would be very well known. But that's not the case, and it definitely deserves more of the spotlight. Check the screenshot below (taken from the PDFEdit website):
Wednesday, February 27, 2008
Htop - Manage Your System Processes in Seconds
Almost every day we use top to check which processes are using the most CPU/memory. It's such a basic tool that I had never spent time searching for something better. Nevertheless, I never found top to be very intuitive or fast. The article Htop - Manage Your System Processes in Seconds shows an excellent alternative. It starts much faster and it's much easier to use than top.
Just to show you an example: if you want to change the order in which processes are displayed, just press F6 and select the criteria (CPU, memory, etc.) with your up/down keys. Press Enter and voilà. Now try the same in top! Yes, you can do it with top almost as fast, but only if you can remember the weird shortcuts. If you only use top occasionally, then htop is much simpler and more intuitive.
Another example is killing a process: select the process with your up/down keys and press F9.
Just a note for the article author: very good blog, focused and to the point.
Tuesday, January 15, 2008
Thumb navigation and Pointui
It's funny to see the evolution in user interfaces for mobile devices. The evolution from static, simple screens to touch screens was a huge step. Allowing a user to write with a plastic pen on a screen was incredible. That evolution allowed the user to do more with his mobile device, at the cost of complexity. There's no doubt that the introduction of the touch screen and the pen increased the complexity of the user interface. And that's because it allowed the user to do more. It almost mimics the desktop computer nowadays. So, is that a good thing...or not? At the time it was a great thing...now we want to be able to do the same things, but with a simplified interface. That's the most recent evolution: thumb navigation, aka "getting rid of the pen and allowing us to use our fingers" :)
I think what drove this evolution was the merging of the phone and the PDA. We don't mind using a pen with a PDA. We are used to it. But we are not used to needing a pen to use a phone. That's just too much work. I need to access my phone fast and hassle-free. Sometimes I only have one free hand to answer the phone or make a call, and it must still be possible to do that.
Besides all the eye candy, I think that was the great evolution the iPhone popularized. I don't own an iPhone, but I do own a Windows Mobile 5 device. And this platform hasn't seen any user interface improvements in years. It's almost the same as what I used several years ago, when I first saw a Pocket PC. So, I have to resort to third-party applications to make my user experience more pleasant.
Pointui is a free application that replaces the WM5 desktop with a thumb-navigation interface with a lot of eye candy, resembling the iPhone in many aspects (like scrolling through a list). It's not completely finished yet (it's a beta), but for me it's better than using the WM5 desktop. You can see a video of what it feels like.
The verdict is: I haven't found many bugs, but a few things are still missing for it to completely replace the WM5 interface in day-to-day tasks. Besides that, it's great and fun to use. Did I mention it's free? :) Go ahead and try it.
Friday, January 11, 2008
Gentoo and Ubuntu 7.10
A few days ago I installed Ubuntu 7.10. I needed something quickly and went for Ubuntu. As you might know I used Gentoo for several years, so this is quite a change...although I haven't decided if I'll keep Ubuntu or install something else.
Before this, I was curious about Sabayon and I'll probably try it in the near future. For those who don't know, Sabayon is a flavor of Gentoo with a nice installer and many pre-built packages, so you don't have to compile everything.
From this, you might think I got tired of compiling software. Maybe a little, but that was never the reason why I liked Gentoo. I liked Gentoo because of portage, the huge repository, good community-based documentation for almost anything, and even having to hack one thing or another... What I don't like is having to wait so long to install a package.
In comparison, with Ubuntu I can obviously install things much faster, but I lose the flexibility that portage gave me. Not the "flexibility" of compiling the software...the flexibility of choosing which version of the software I want (not just the latest) and which features get compiled into it. In the past I had issues with this in Ubuntu: a package was available in the repository, but compiled without a feature I needed. Another thing (related to this) I dislike in Ubuntu is the huge dependency tree I have to pull in when installing some packages. In part that's because in Gentoo I always had several features disabled (like KDE or GNOME integration when I don't use them), which saved me from installing a bunch of libraries I didn't need.
Well, for now I'll be using Ubuntu, and I've already found several good and bad points:
- Something I liked was the hardware detection. It detected everything pretty well, including my webcam, printer and digital camera.
- Boot time is fast; even GNOME starts fast (but I prefer Xfce, which is always faster)
- Good collection of GUIs for system configuration.
- The login window is sometimes just a white screen
- My wallpaper disappears sometimes, leaving me with a pale blue background
- The Flash plugin for Firefox is broken (I fixed it manually)
- Some strange effect in window titles that disappear, under GNOME
- Compiz Fusion is not available under Xfce (I'll try to fix this by hand when I have the time)
Wednesday, June 20, 2007
New ideas
Here's an interesting article about innovation and how we can feed our brain to breed new ideas. I still haven't tried the techniques, but they seem logical enough to work.
And no, the reason this blog hasn't been updated has nothing to do with lack of ideas, but with time-consuming personal matters :)
Cheers.
Friday, March 09, 2007
Groovy impressions
Recently I've been doing some work in Groovy. First impressions were very good, but I won't start using it for everything. I think we should choose the right tool for the job, and dynamic languages like Groovy are great for some tasks but worse for others (and this is not a universal truth; some people work better one way, others another). I already knew some of the syntactic sugar (map handling, duck typing, etc.) that is also common to Ruby or Python, so that was not a surprise. What really impressed me in Groovy was how simple it makes complex tasks. For example: file handling. Here's a small example:
File f = new File("a.txt")
f << "text"
This is enough to append the given string to the file. This is different from just "typing less characters". What's great here is that resource handling is completely hidden from the programmer. We just tell it to read or write something to a file and don't have to remember to open or close resources.
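For comparison, here's roughly what the same append looks like in plain Java (no try-with-resources back then), which is exactly the boilerplate Groovy hides from you:
import java.io.FileWriter;
import java.io.IOException;

public class AppendExample {
    public static void main(String[] args) throws IOException {
        FileWriter writer = new FileWriter("a.txt", true); // true = append mode
        try {
            writer.write("text");
        } finally {
            writer.close(); // easy to forget, and forgetting it leaks the file handle
        }
    }
}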
Another gem:
URL url = new URL("http://www.google.com")
File f = new File("a.txt")
f << url.text
Yes, it does what you're thinking: automatically fetches the contents of the given URL and stores it in the file a.txt. Now, if you add closures and regular expressions, page scraping just became a lot easier :)
Some people see less typing as the ultimate goal to increase productivity. Although I think much faster than I type, I still spend more time thinking than typing. In the end I don't think a few fewer characters make any difference. I think it's much more a psychological effect than anything else (I constantly see people writing about how Ruby makes programming more "fun"). But the kind of simplicity shown above does make a difference, because it makes file handling easier and less error-prone (no more forgetting to close resources).
Sunday, March 04, 2007
Fixing dead pixels
Fortunately I don't have any dead pixels on my monitor. If I had, I could try this, although I'm highly skeptical about its effectiveness. But, who knows...every now and then I get surprised by something :)
Thursday, March 01, 2007
Java 6 Hotpatching
I already knew about the ability of a Java process to attach itself to another process for monitoring purposes. Java 6 comes with a nice graphical console (jconsole) that shows several statistics about the running process, like memory usage, etc.
What I didn't know was that the API behind this also allows hotpatching. According to the article, we can substitute a class with a new version of it at runtime! Although I don't see this sort of thing being used very often, it's still very cool :)
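A rough sketch of how this looks with the instrumentation API. The class and file names are invented for the example, the agent jar needs a Can-Redefine-Classes: true manifest entry, and you'd load it into the running VM with the Java 6 attach API:
import java.io.ByteArrayOutputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.lang.instrument.ClassDefinition;
import java.lang.instrument.Instrumentation;

public class HotPatchAgent {
    // Called when the agent is loaded into an already running VM.
    public static void agentmain(String args, Instrumentation inst) throws Exception {
        // New bytecode for the class we want to replace (path and class name are hypothetical).
        byte[] newBytes = readFile("/tmp/MyService.class");
        Class<?> target = Class.forName("com.example.MyService");
        // Swap the old class definition for the new one at runtime.
        inst.redefineClasses(new ClassDefinition(target, newBytes));
    }

    private static byte[] readFile(String path) throws IOException {
        FileInputStream in = new FileInputStream(path);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buffer = new byte[4096];
        int n;
        while ((n = in.read(buffer)) != -1) {
            out.write(buffer, 0, n);
        }
        in.close();
        return out.toByteArray();
    }
}
Note that the redefinition is limited: you can change method bodies, but not add or remove methods or fields.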
Tuesday, January 16, 2007
Linux needs better IM clients!
There are several IM clients for Linux and they work perfectly fine for the most basic operations, like chatting and file transfer. I personally like Gaim. It's fast, stable and the simplicity of its interface is great. I've also tried aMSN, Kopete and Mercury with different degrees of success. But compared to their Windows counterparts (MSN Messenger, Yahoo, ICQ, etc.) they seem a long way behind. Modern IMs now have things like animated smileys, custom and animated backgrounds, small flash animations and, most importantly, webcam and audio support.
Well, some Linux IMs actually have most of those features, with aMSN probably being the most complete and Gaim the least (it doesn't have any of those features). My experience with aMSN (and the others) has not been very successful. aMSN seemed very unstable, and although I could get my webcam to work, it was too sluggish to be of any use. Mercury seemed promising but I couldn't get the webcam to work. And audio definitely didn't work with either. I know some have had success with these programs, but they definitely need a lot of work to be able to compete with the Windows IMs.
Many Linux users don't care about these features, but they are important. And if we want Desktop Linux to be successful, they're essential, like it or not. Audio and video are the most important. And it's not only about keeping up with your friends. Many companies use this for internal video-conferencing (it's simpler, cheaper and perfectly sufficient in many cases).
But custom backgrounds, flash animations, etc. are also important. And I know many people dislike these features and don't understand why they matter. For several years I lived without those flash animations and fancy backgrounds, but now I'm rebooting into other OSes to use them. Before you think I've developed some sort of mental illness, let me give you a simple, compelling reason why it matters: when you want to talk to a (non-techie) girl who happens to like those features, the whole argument "I use Gaim because I have Linux and it's open source and it's better..." just doesn't seem that strong all of a sudden. And you'll end up using (or letting her use) those features. It's just an example :) but it shows that there are people who want to use those features.
The point is: in many aspects Linux is leading the innovation (take Xgl, for example), but IM is several years behind. I know it's a very difficult subject because most of the communication protocols are not public. But I'm not sure this lack of good and innovative IMs is all down to hidden protocols. I just think there's not enough motivation among Linux programmers to implement it. Maybe distros like Red Hat, SUSE or Ubuntu will start seeing this and start promoting more advances in this area.
Sunday, December 17, 2006
Java XML Data Binding
I don't recall the last time I used DOM or SAX to process XML documents. Things have evolved, and processing XML that way seems more and more like unnecessary low-level work. And that's because of the superior XML Data Binding. XML Data Binding is all about mapping objects to XML, and back again. It's not document-centric, but data-centric. After all, isn't XML a way of representing data? The data is what matters. For example, if I have the following XML:
<customer>
    <code>128682</code>
    <name>Mr. Smith</name>
</customer>
I don't want to mess around with DOM objects or SAX events if I just need to get the data into Java objects. I just want to pass the XML to some method and get a Java Bean named Customer with properties code and name (of course, the real advantage comes when the XML is a lot more complex). That's what XML Data Binding frameworks are for. And there are plenty of them. They typically work by generating classes from a provided XML Schema. You'll normally get the Java Beans that hold the data and a few classes responsible for the marshalling/unmarshalling of the XML.
I tried a few when developing Axis2 web services. Axis2 supports several data binding frameworks and also provides its own implementation: ADB. From what I've seen of ADB, I don't like it much. By default it generates Java Beans with strange-looking code filled with inner classes. And although it claims that it can generate plain POJOs, I couldn't get them to work well.
There's also XMLBeans, which is known to be the most comprehensive out there. But it also generates a lot of classes and a lot of code.
A third, and in my opinion a better choice, is JibX. With JibX you can do things a little differently. Instead of letting JibX generate the Java Beans, I prefer to create them myself and then configure a short XML mapping file (in the simplest case, just to tell which class is the root element). JibX then generates only 2-3 classes responsible for mapping the XML to the Java Beans. This way my Java Beans stay clean and I have much less generated code.
But there are even more alternatives. I've used XStream several times and it's great. It doesn't generate any code; it's all done at runtime. It's similar to JibX: you also have to provide the Java Beans and some mapping configuration, but instead of generating code to handle the XML, it does everything at runtime. You could argue that XStream is better than JibX, but XStream also has a disadvantage: it doesn't support XML namespaces, which is essential for web services development. I've used it in other situations, when I don't need XML namespaces.
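Here's roughly what that looks like with XStream for the customer XML above. Just a minimal sketch, assuming the XStream jar is on the classpath (I'm using the DOM driver here to avoid the extra XPP3 dependency); the class names are my own:
import com.thoughtworks.xstream.XStream;
import com.thoughtworks.xstream.io.xml.DomDriver;

public class CustomerBindingExample {
    // The Java Bean I write myself; XStream maps the XML elements onto its fields.
    public static class Customer {
        String code;
        String name;
    }

    public static void main(String[] args) {
        String xml = "<customer><code>128682</code><name>Mr. Smith</name></customer>";

        XStream xstream = new XStream(new DomDriver());
        xstream.alias("customer", Customer.class); // map the <customer> element to the class

        Customer customer = (Customer) xstream.fromXML(xml);
        System.out.println(customer.code + " / " + customer.name); // 128682 / Mr. Smith

        System.out.println(xstream.toXML(customer)); // and back to XML again
    }
}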
I should also mention JAXB, Sun's official specification for XML Data Binding (JAXB is only a specification, but there are at least two implementations: JAXB RI and JaxMe), but I still haven't tried it.
Thursday, November 16, 2006
Thread pooling in Java 5
Only recently did I begin learning (and working with) Java 5's new features. When the most emblematic new features were announced, there was an immediate reaction to them. Some people loved the new features, others hated them. To be honest, I liked some things (like enumerations and generics) but never cared much about auto-boxing or varargs. However, I'm not going to talk about these popular features that almost everyone has heard of. It seems there's plenty more to know about Java 5. Today I discovered a fantastic (at least on first impression) and powerful new feature: the java.util.concurrent package. This package provides several utilities to handle threads and locks, simplifying the whole process. For now I only tried one thing: creating thread pools. This is something that comes in handy very often and it's now very simple to do.
First, you have the usual Runnable that implements whatever you want the thread to do. In this simple example it just prints "hello".
public class WorkerThread implements Runnable {
    public void run() {
        System.out.println("hello");
    }
}
Now, let's imagine I want to run 300 WorkerThread tasks, but allowing only 5 threads to run simultaneously. So, a thread pool has to be created with size 5. Then I submit the 300 tasks to the pool, which only executes 5 at a time, queuing the rest. As one task finishes, the next one in the queue starts.
ExecutorService pool = Executors.newFixedThreadPool(5);
for (int i = 0; i < 300; i++) {
    pool.submit(new WorkerThread());
}
As simple as that. But there's a lot more to learn. For example, there's a new interface Callable, that you can use instead of Runnable. Callable allows the executing method to return arbitrary Objects and throw exceptions, something you couldn't do with Runnable's run() method. Look here for more information about this and other cool enhancements that didn't get the overall attention they deserve.
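Here's a small sketch of Callable combined with Future, which is how you get the result back (I've also added the shutdown() call that lets the pool's threads terminate once they're done):
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class CallableExample {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(5);

        // Unlike Runnable.run(), call() can return a value and throw checked exceptions.
        Callable<String> task = new Callable<String>() {
            public String call() throws Exception {
                return "hello from " + Thread.currentThread().getName();
            }
        };

        Future<String> future = pool.submit(task);
        System.out.println(future.get()); // get() blocks until the task has finished

        pool.shutdown(); // no new tasks accepted; threads exit once the queue is empty
    }
}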
Tuesday, November 07, 2006
LCD Monitor guide
A few days ago my LCD monitor started making some strange noises. I knew it wouldn't last long. It still works, but barely (the image goes black from time to time). Although I might try to get it fixed, I immediately started searching for a new one. Normally I like to learn about all the little differences and rate the available choices. However, there's just one little problem: there are hundreds of different monitors available! No, I didn't quit, but instead of collecting all the information I started eliminating most of them based on a few characteristics. So here's what I found out.
First and most important of all: the budget. Because I didn't want to buy a very expensive monitor, I think I reduced my choices to half :D
Then there's the size. I own a 17" and I will probably buy another 17", but I still haven't completely discarded a 19" monitor. I won't get a better resolution with a 19"; it's the same 1280x1024 that I get with a 17". Everything just looks a little bit bigger (mostly the fonts). I think it's not better or worse. It depends on the distance to the monitor. I read somewhere that the distance from the eyes to the monitor should be twice the diagonal. Basically it comes down to which you feel more comfortable with. And I feel comfortable with the 17". Don't think it's the price...I surprisingly found a few 19" monitors cheaper than most 17" ones. There are also the widescreens, which may be better for some situations, like watching DVDs. But I'm still not sure about those...I've never worked with one so far.
At this point, the rules I outlined already exclude lots of monitors, but there are still too many, and I'm lazy enough to just pick a few well-known brands instead of going through all of them! Also, it helps if the monitor you want is available at your local stores (unless you want to buy online), and that should reduce the list a bit more.
LCD Monitors have a lot of specifications, but not everything is important:
- Contrast ratio - The contrast ratio is an important factor, but different vendors calculate this value in a different way. So, it's not a reliable parameter. There are lots of 500:1, 700:1...and then there are a few 2000:1. Clearly this last one was measured differently. I just established a minimum here (500:1) and then pushed this parameter to the bottom of the decision-making process.
- Interface - Every monitor has an analog input, and some have the better digital input (DVI). I'm not sure how much better the digital input is (some report better image quality because there's no analog/digital conversion), but I chose only monitors with a DVI input.
- Brightness - This is almost always 250 cd/m2, so I completely ignored it.
- Viewing angle - Most monitors have 160/160, so I excluded everything worse than that. I like to be able to look at the monitor from different angles, and lower values don't allow good viewing.
- Response time - I've seen values from 2ms to 12ms. From what I've found, 12ms is very good and going below that doesn't make a noticeable difference. Of course there are monitors with response times greater than 12ms (which I ruled out), but most of the new ones are already at 12ms or better.
- Dot pitch - This almost never varies, so I ignored it.
- Power consumption - You could just ignore this, but if you care about the environment (and your bills) you could choose a monitor with better consumption. The best I saw was 30W when "on" and 1W in "standby".
- Base - It's important to have the monitor well positioned, so the more options it gives you, the better. Possible adjustments are: rotate, tilt, height adjustment and pivot. Almost every monitor has tilt adjustment. Rotate is also very common. Height adjustment is the most important to me, but it's not so common. Pivot is becoming more common, but I have no use for it.
- Extras - Some monitors have lots of extras, like speakers, USB ports, etc. I prefer not having to pay for these useless extras.
You might be asking: "so which one did you choose?" :) Well, I have a small list, but I haven't made up my mind yet. And I also don't want to spoil this wonderful experience of choosing monitors for you :P
Wednesday, October 18, 2006
Another simple and small database
A while ago I reviewed HSQLDB, a small and fast Java database. I've used it a few times quite successfully. I'm now developing a Java desktop application, and since I needed a small in-memory database the first choice was obviously HSQLDB. However, I decided to do some investigation to see what else is out there. And surprisingly, I stumbled upon HSQLDB's apparent successor: H2. Quoted from their website:
The development of H2 was started in May 2004, but it was first published on December 14th 2005. The author of H2, Thomas Mueller, is also the original developer of Hypersonic SQL. In 2001, he joined PointBase Inc. where he created PointBase Micro. At that point, he had to discontinue Hypersonic SQL, but then the HSQLDB Group was formed to continue to work on the Hypersonic SQL codebase. The name H2 stands for Hypersonic 2; however H2 does not share any code with Hypersonic SQL or HSQLDB. H2 is built from scratch.
According to this, it's also faster than HSQLDB, which is always good. And after reading the documentation, H2 feels more solid and robust. Maybe it's because it has a few interesting features like cluster support, encrypted databases and a recovery tool. Also, data is not stored in text files with SQL inside, as with HSQLDB, which always felt a bit clumsy to me.
I also like the provided SQL console: it's a web application that you can launch directly from the database jar. It could be more complete, but it works well and I especially like the auto-completion feature.
So, for now, I will be using H2 in my application and see if it's as reliable as it seems.
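For the record, using it from JDBC is as simple as any other database. A minimal sketch with an in-memory database, assuming h2.jar is on the classpath (the table and data are made up; "sa" with an empty password is the usual H2 default):
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class H2Example {
    public static void main(String[] args) throws Exception {
        Class.forName("org.h2.Driver"); // H2's JDBC driver, from h2.jar
        // "mem:" means the database lives in memory and disappears with the last connection.
        Connection conn = DriverManager.getConnection("jdbc:h2:mem:testdb", "sa", "");

        Statement st = conn.createStatement();
        st.execute("CREATE TABLE customer(code INT PRIMARY KEY, name VARCHAR(255))");
        st.execute("INSERT INTO customer VALUES(128682, 'Mr. Smith')");

        ResultSet rs = st.executeQuery("SELECT name FROM customer WHERE code = 128682");
        while (rs.next()) {
            System.out.println(rs.getString("name"));
        }
        conn.close();
    }
}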
Sunday, September 17, 2006
Kung-Fu Java
No, I'm not talking about any kung-fu variant from Indonesia. I'm talking about JavaBlackBelt. In their own words:
JavaBlackBelt is a community for Java and related technologies certifications.
It's basically an alternative to Sun's official certifications. And although you won't get the same recognition (not even close), JavaBlackBelt's exams cover a lot more ground than Sun's. You don't get evaluated only on your knowledge of the official JDK libraries, but also on your knowledge of many popular technologies that a Java developer might encounter: Struts, Spring, XML, JDBC, Ant...just to name a few.
Also, the whole certification path is fun and very well thought out. When you first register, you receive a white belt, and from there you have to pass exams in order to receive the other belts, all the way to the black belt. To take an exam you also have to pay, but in contribution points, not money :) Contribution points are what holds JavaBlackBelt together as a community. You receive contribution points by creating questions for a certain exam (they are voted on and filtered before getting into the real exam) or by evaluating other people's questions. Then you can use your contribution points to take exams and also to participate in auctions. Auctions allow you to spend your contribution points on prizes (normally books or t-shirts).
For now, I have a yellow belt, but I made this post because I just successfully took the JDBC exam and I'm just one exam away from my orange belt. Wheeeeeh...err, I mean: YEAAHHH (with strong manly voice).
Tuesday, August 29, 2006
New breed of menus
Hi. It's been a long time since the last post, but finally I could get some free-blogging-time :)
A few weeks ago I installed Novell's SUSE Linux Enterprise 10. I'm not planning on ditching Gentoo any time soon, but I got curious about some of the innovations the folks at Novell are doing. And I also had some disk space to waste.
My main interest was to see the new menu interface in action. Instead of the regular list of applications, this one is quite different. It gives more importance to the favorite and recently used programs (I never really cared much about this). To get to the rest of the applications you have a search mechanism. Begin typing the name of the application you want and watch the options converge on that name. Beagle is the technology behind this.
At first glance, it seems more productive, especially for keyboard-intensive users. Just start typing the name and you find the application quite fast. However, we still have to use the mouse: we have to click the button that opens that window (there's probably a keyboard shortcut, but I didn't find it) and then click on the application name after you've found it. I think I could have found the same application with the same number of clicks in the old-fashioned menu (most of the time). But there's definitely a trend going on here. Everything in the computer is now searchable, even the applications. And this interface somehow reminds me of the auto-completion features in most programming IDEs: hit Ctrl+Space and you get a list of options, which you can refine by typing the first letters. I would really like a menu as productive as this, without the need for any mouse clicks.
Another use of this kind of search is when we don't know the exact name of the application (just the description)...but that seems of interest only to real newbies. Sooner or later everyone learns the names of all their applications.
So after spending some time using it, I really felt it was slower, mainly because I had to open a second window after opening the menu. It could be a lot better if it were directly embedded inside the menu window.
Thursday, May 25, 2006
Oh No! More Ajax
Ok, now things are starting to become ridiculous. Ajax this, Ajax that... everything has to be Ajax-enabled to be considered cool. What else could they think of? An Ajax operating system? Well...yes :)
Welcome to the wonderful world of AjaxOS, a place where every file is free... I know I'm being a little sarcastic...but just hearing the name sounds a little silly, to say the least. But I don't blame them. Right now the word sells, so it's a perfect way of getting attention.
Anyway, this new operating system is in fact based on the well-known Linspire Linux distribution. What they did was integrate (Ajax-enabled) web applications into the desktop. As cited on their site:
By seamlessly integrating the power of web based applications with Linspire, ajaxOS recognizes any compatible file (doc, svg, odf, txt, xls, etc...) and launches the most up-to-date AJAX software from a Firefox browser.
Actually, it can be a good idea if it works well. I'm not saying this will be the future, but it's an alternative that has a few advantages, like a small installation or the ability to open any file format without first having to install the corresponding viewer/player. In some cases this might be useful, like on public-access computers, as it reduces configuration and maintenance effort drastically, and may even provide a better service (public-access computers don't usually have all the software one might need at some point).
Anyway, it's not available yet. Let's wait and see...
Friday, May 19, 2006
Ajax toolkits
Well, it appears Ajax is here to stay. I've already used some Ajax-like features in some projects, but nothing too big yet. The thing I like least about Ajax is JavaScript....yes, I know that without it, it's not Ajax :)
However, my gripe with JavaScript is not unique...
Recently I discovered two new frameworks for dealing with Ajax, and both of them reduce JavaScript code to zero! At least from the developer's point of view. These are Google Web Toolkit and ZK. Both of them are Java frameworks, and they hide all the JavaScript away from the developer. The idea is to create an event-driven framework filled with rich GUI widgets that resemble desktop apps. The thing I liked in both frameworks is how Ajax is put inside the framework and away from the developer. It's not a framework for building Ajax; it's a framework for building rich web interfaces that are powered on the inside by Ajax. How is this done?
In this kind of application there are three parts: client-side code, server-side code and the glue that binds them. Client-side code has to be JavaScript (along with HTML, CSS, etc.) for the browser to interpret. Server-side code can be in almost any language, but in this case it's Java. The third part, and the hardest to accomplish, is the communication between the previous two. Ajax provided us with a way to create that communication, but coding it by hand is complicated and troublesome (although interesting to learn). What both frameworks provide to simplify these tasks is the following:
First, they provide a way of building the web interface that is more similar to the way we build desktop apps. They both provide a rich set of graphical widgets, typical of desktop apps (of course, on the inside this is translated into something the browser understands ;)). Second, and more importantly, all communication with the server side is done automatically and transparently. From the point of view of interface design, you lay out widgets that have events. From the point of view of the server, you have a simple Java API that handles events. Everything in between is provided by the frameworks and executed automatically. This way, server code is like a traditional desktop app: event-driven.
Despite their similarities, GWT and ZK implement this idea quite differently. In GWT you do everything in Java, even specifying the layout and widgets. But instead of compiling your Java files with good old javac, you use a compiler provided by GWT. This compiler translates the Java code you happily wrote into JavaScript. This is basically creating JavaScript without actually writing it :)
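To give an idea, a GWT module entry point looks something like this. It's a minimal sketch based on the GWT 1.x API of the time (the module descriptor and host HTML page are omitted, and the class name is my own):
import com.google.gwt.core.client.EntryPoint;
import com.google.gwt.user.client.Window;
import com.google.gwt.user.client.ui.Button;
import com.google.gwt.user.client.ui.ClickListener;
import com.google.gwt.user.client.ui.RootPanel;
import com.google.gwt.user.client.ui.Widget;

// Plain Java that the GWT compiler translates into JavaScript for the browser.
public class HelloEntryPoint implements EntryPoint {
    public void onModuleLoad() {
        Button button = new Button("Say hello");
        button.addClickListener(new ClickListener() {
            public void onClick(Widget sender) {
                Window.alert("Hello, written in Java, running as JavaScript");
            }
        });
        RootPanel.get().add(button);
    }
}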
In ZK you create the widgets and the layout in XUL and XHTML (actually ZUML, which in the future may also support XAML and XQuery). Then you create the event handlers in Java to do the server work. Binding widgets to event handlers is done in ZUML, and it's easy to do.
I still don't have an opinion on which one I like more. At first sight GWT seems easier to work with, but I'm always suspicious of generated code. ZK seems more flexible and it's based on several interesting technologies, like XUL. I believe both can be successful, but I'm pretty sure the Google name will carry its weight, at least at first ;)