Monday, November 28, 2005

Javascript compressor

Let's see if you are a real hacker.

Your problem: a web page that is somewhat slow, with lots of javascript code.

You can:


  1. ignore the problem
  2. activate mod_deflate in the server for javascript code (be careful with old browsers!)
  3. use a javascript compressor to remove any extra spaces, new lines, comments, etc. (see the example after this list)
  4. take an existing javascript parser and make it rewrite your javascript code as above, without comments, spaces, etc., safely renaming internal variable names.
  5. download the ECMA standard, build a full javascript parser from scratch, and make it rewrite your javascript code as above. Extra points if you implement some additional size optimizations.
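
For instance, a compressor of the kind described in points 3 and 4 turns this (an invented example function):

    // compute the total price of a cart
    function computeTotal(items) {
        var total = 0;
        for (var i = 0; i < items.length; i++) {
            total += items[i].price;
        }
        return total;
    }

into this, with comments and extra spaces removed and the internal variables safely renamed (the function name stays, since code elsewhere may call it):

    function computeTotal(a){var b=0;for(var c=0;c<a.length;c++){b+=a[c].price;}return b;}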


After a full weekend working on this thing, I had a parser "almost" working. Some more evenings and I had a compressor, but without renaming variables (yet).

Now I'm trying to finish the parser. As always, the last 5% takes 90% of the time. My parser is compliant except for:


  • Virtual semicolons
  • Regular expressions


We all know that in javascript you have to separate statements with semicolons, but in some cases you can omit them. Among others, you can omit a semicolon if you separate the current statement from the next one with at least a new line, and the two statements combined as one would raise an error. And somebody actually thought *that* would make javascript easier to understand (?!)
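
For example (a made-up snippet):

    var a = 1
    var b = 2        // a semicolon is inserted: "var a = 1 var b = 2" would be an error

    var c = a
    (b)              // no semicolon is inserted: "var c = a(b)" parses fine, so this becomes a call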

I have only modified my grammar so it is able to add virtual semicolons before '}' and before the end of file. These are the two most useful points where you can unambiguously omit the semicolon.
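
That already covers cases like these (a made-up snippet):

    function inc(x) { return x + 1 }   // virtual semicolon before '}'
    var y = inc(1)                     // virtual semicolon before the end of file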

Regular expressions are also a bit hard to parse with a LALR(1) grammar. I'm thinking of matching a '/' or '/=' token for a primary expression, and then, in the action of these two tokens, switching the lexer so that it scans the rest of the regular expression. (At least that is what Rhino does.)
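
The underlying problem is that the lexer alone cannot tell a division from a regular expression without the parser's context (made-up variables again):

    x = a / b / g;   // two divisions
    y = /b/g;        // a regular expression literal with the "g" flag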

If I fail, I will rewrite the parser as an LL(1) one. I will have the same problems, but this time the parser will be hand-made, and thus I should be able to add these hacks where I need them.

The good news is that my code is fully parsed and written back correctly, except for the two regular expressions I use. I will then start working on more advanced compression features, not yet available anywhere else.

I will keep you posted!

User areas of photos

He asked, I delivered.

Do you have a ton of photos that you want to show on a map? Worried your pictures will soon fade away from the home page of Panoramio to some hard-to-find page?

Now you can restrict the photos shown in Panoramio to those of a particular user.

See, for instance, the pictures taken by Eduardo. Or see them on a single page if the map thing does not suit you.

If you found it hard to follow the comments people dropped on your photos, now you only have to check your Panoramio page (sign in and then click on your name on the home page). There you will see your latest pictures and the comments people have left on them.

Enjoy!

Friday, November 25, 2005

Web 2.0 definition

Best definition of Web 2.0 to date, directly from The Devil's Dictionary:


Web 2.0, proper noun

The name given to the social and technical sophistication and maturity that mark the— Oh, screw it. Money! Money money money! Money! The money’s back! Ha ha! Money!

Thursday, November 17, 2005

Direct Manipulation

In Panoramio you can move your photos by dragging their red pins over the map. That is direct manipulation.

Imagine the old system: first, select "move photo"; second, enter a new location; and third, save the new location. Thus every feature needs explanations, controls and several steps. Adding features makes the interaction more complex, slower and more difficult.

In the old system, the easy way of keeping interfaces usable is reducing the number of features. However, while "less is more" is a nice slogan, it doesn't really mean good interaction design. That is the point of Donald Norman in The truth about Google's so-called "simplicity". Norman thinks Google looks simple just because they hid everything but the search box. He believes this kind of "simplicity" doesn't make a good interface. It looks simple because it has few links, but there are a lot of different contents grouped under them, so these links aren't clear and lack information scent. For example, where is Google Scholar, under "Advanced Search" or "More"? (There are more examples in Norman's article.)

It's true there are websites with too many unnecessary buttons and options. But it's also true there are many useful elements that you just can't hide. The easy way isn't enough here.

Direct manipulation can solve some of these problems. Direct manipulation is nothing new, but it came to the web with Ajax. In Flickr or Basecamp you just need to click on a title to edit it. Since you don't edit titles very often, you don't need a permanent "Edit" link beside every title, using space and crowding the interface. Direct manipulation is also better than hiding the "edit" feature under a vague general "settings" link.
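
As a rough idea of how the click-to-edit trick can work, here is a minimal sketch in plain javascript (the element id and the saveTitle function are hypothetical; a real version would send the new value to the server):

    // Turn a heading into a click-to-edit field (minimal sketch).
    function makeEditable(heading, save) {
        heading.onclick = function () {
            heading.onclick = null;  // ignore further clicks while editing
            var input = document.createElement('input');
            input.value = heading.firstChild.nodeValue;
            heading.replaceChild(input, heading.firstChild);
            input.focus();
            input.onblur = function () {
                heading.replaceChild(document.createTextNode(input.value), input);
                save(input.value);            // e.g. an XMLHttpRequest to the server
                makeEditable(heading, save);  // re-arm the click handler
            };
        };
    }

    makeEditable(document.getElementById('photo-title'), saveTitle);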

Direct manipulation is not easy to design; there are still no standards to follow. You can't use direct manipulation for everything, but it can be very useful sometimes.

Tuesday, November 15, 2005

New page to see the photos

I have made a new page to see the images in Panoramio. The previous one had several problems, and also some advantages.

It was not really a new page, but just a bit (a big "bit") of javascript that showed and hid some parts of the main page. It was very fast for browsing the latest photos posted in Panoramio, and each time you changed the photo, only the essential parts of the web page changed. Switching between the photo and the map view was also very fast, as it was only a "show this, hide that" operation.

This was its advantage, and it was also its major problem.

The code was hard to grow, and the layout of the page was severely "inspired" by the layout of the main page. The page was not stored in the browser history, the back button to go back to the map did not work, and linking to the page was as hard as linking to a particular place on the map. Some of these things were fixable, but the "hard to grow" issue and the layout problem were the two real, big problems that I wanted to fix, and that was hard to do with the existing code.

When mkling asked for comments on the pictures, a feature that we had already discussed and pushed back on the TODO list due to these problems, I decided to redo the whole page and split it from the main map.

The relayout let me show all the relevant info on the same page without the need to scroll to see the contextual maps, add comments on images, and navigate through the photos of the user who posted the image and through the photos in that region of the planet.

The downside is that it is now slower to go back to the map. I'm putting Panoramio on a javascript diet, and I will try to pull all the needed data in the minimum number of queries possible, to speed up the "show the map" operation.

I'm also playing with the implementation of some features suggested by Eduardo, such as showing the name of the city where the photo was taken. I hope to have something to show you in the next few days.

Wednesday, November 9, 2005

Friends

I knew our friends Juan and Estela were visiting Joaquín in Paris this weekend, but I didn't expect them to post their pictures rowing on the lake of the Versailles Palace on Panoramio.com. Cool!

Thursday, November 3, 2005

US cities

I have finally merged the USGS data about US cities into our search tables.
So now you can jump to any place in the world except those in Antarctica (they are in yet another database).

Enjoy!