Monday, December 08, 2008

Followed the Waves - Melissa Auf Der Maur

You probably know Melissa Auf der Maur as the bassist of the band Hole. She was the cute skinny redhead. She also toured with Smashing Pumpkins for a while. I don't remember where I've heard of her, but I got her album and listened to it and I really enjoyed it. Here is a taste of her music from her (so far) only album Auf der Maur.

About a second album, I am quoting Wikipedia: "In a 2007 interview, Auf der Maur announced that she had finished her second solo album, which would go hand in hand with a graphic novel and a concept film, the release dates of which are unclear. The album will be released under the name of MAdM, whereas the comic and film will go by Out of Our Minds, or OOOM for short. A website containing teasers of the projects, as well as a movie trailer, was launched in August 2007."


Friday, December 05, 2008

Microsoft ReportViewer and the 'Session expired' error

Well, I was trying to use the Microsoft ReportViewer control for the first time. I used some guy's code to translate a DataSet XML to RDLC and ran the app. Everything went OK except for the export to Excel (which was the only real reason I attempted using this control in the first place). Whenever I tried exporting the report, the 'Session expired' error would pop up.

Googling I found these possible solutions:
  • set the ReportViewer AsyncRendering property to false - doesn't work
  • use the IP address instead of the server name, because the ReportViewer has issues with the underscore character - doesn't work; besides, I didn't have any underscore characters in my server name to begin with
  • set the maximum number of workers from 2 to 1 in the web.config (not tried, sounds dumb)
  • set cookieless to true in the web.config sessionState element - it horribly changed my URL, and it worked, but I would never use that
  • set ProcessingMode to local - that seemed to work, but then it stopped working, even though I am not using the ReportViewer with a Reporting Services server
  • because towards the end I noticed that the problem was not an expiration, but rather a Session communication problem, I tried setting machineKey in web.config; however, that doesn't apply to the InProc setting, so it didn't work either

For a few days, this blog post showed the last solution as working. Then it failed. I don't know why. I fiddled with the RDLC file a little (arranging columns and colors and stuff) and then it seemed to work again. I have no idea why.

I got mad and used Reflector to get the source code of the ReportViewer control and see where it all happens and why! I have found the following:
  • the error message looks like this:
    ASP.NET session has expired
    Stack Trace:
    [AspNetSessionExpiredException: ASP.NET session has expired]
    Microsoft.Reporting.WebForms.ReportDataOperation..ctor() +556
    Microsoft.Reporting.WebForms.HttpHandler.GetHandler(String operationType) +242
    Microsoft.Reporting.WebForms.HttpHandler.ProcessRequest(HttpContext context) +56
    System.Web.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute() +181
    System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously) +75
  • the error occurs in the constructor of ReportDataOperation:
    public ReportDataOperation()
    {
        this.m_instanceID = HandlerOperation.GetAndEnsureParam(requestParameters, "ControlID");
        this.m_reportHierarchy = (ReportHierarchy)HttpContext.Current.Session[this.m_instanceID];
        if (this.m_reportHierarchy == null)
            throw new AspNetSessionExpiredException();
    }
  • the Session entry whose absence makes the error be thrown is set in the SaveViewState() override method in ReportViewer
  • apparently, the error occurs only sometimes (probably after the HttpApplication was restarted, and while debugging in Visual Studio)

This reminded me of the time when I was working on a Flash file upload control and used an HttpHandler to serve the Flash file from the assembly. Back then, the control would not work with Firefox and some other browsers, which would use different sessions for the web application and for getting the Flash from the axd HTTP handler.

This time it works with Firefox and IE, but it fails in debug mode and only in IE. I am using IE8, btw.

My conclusion is that
  1. the ReportViewer control was poorly designed;
  2. the ASP.NET 'Session expired' error misleads the developer, since it is not an expiration problem;
  3. the actual problem lies in the inability of the ReportViewer control to communicate with the HttpHandler;
  4. the problem could also be related to the browser using separate threads to get the application and to access the HttpHandler.

Sunday, November 30, 2008

New Blog Format!

Hi, I am working on a new blog format. As I am lazy and a complete HTML and CSS noob, it will take a while. Please feel free to comment on the new look. Actually, feel obligated to do so! :)

Windows Azure

I went to a presentation of a new Microsoft concept called Windows Azure. Well, it is not so much a new concept as Microsoft entering the distributed computing competition. Like Google, IBM and - most notably - Amazon before it, Microsoft is using large computer centers to provide storage and computing as services. So, instead of having to worry about buying a zillion computers for your web farm, managing the failed components, the backups, etc., you just rent the storage and computing and create an application using the Windows Azure SDK.

As explained to me, it makes sense to use a large number of computers, specially structured for this cloud task, with centralized cooling, automated update, backup and recovery management, etc., rather than buying your own components. More than that, since the computers run the tasks of all customers, there is a more efficient use of CPU time and storage/memory.

You may pay some extra money for the service, but it will closely follow the curve of your needs, rather than the ragged staircase that a self-managed system usually is. You see, you would have to buy all the hardware resources for the maximum amount of use you expect from your application. Instead, with Azure, you just rent what you need and, more importantly, you can unrent when the usage goes down a bit. What you need to understand is that Azure is not a hosting service, nor a web farm proxy. It is what they call cloud computing: the separation of software from hardware.

Ok, ok, what does one install and how does one code against Azure? There are some SDKs. Included is a mock host for one machine and all the tools needed to build an application that can use the Azure model.

What is the Azure model? You have your resources split into storage and computing workers. You cannot access the storage directly and you have no visual interface for the computing workers. All the interaction between them or with you is done via REST requests, http in other words. You don't connect to SQL Servers, you use SQL Services, and so on and so on.

The graphical representation of the Azure vision

Moving a normal application to Azure may prove difficult, but I guess they will work something out. As with any new technology, people will find novel problems and the solutions for them.

I am myself unsure of the minimum size of an application at which it becomes more effective to use Azure rather than your own servers, but I have to say that I like at least the model of software development. One can think of the SDK model (again, using only REST requests to communicate with the Azure host) as applicable to any service that would implement the Azure protocols. I can imagine building an application that would take a number of computers in one's home, office, datacenter, etc. and transform them into an Azure host clone. It wouldn't be the same, but assuming that many of your applications worked on Azure or something similar (maybe even a bit of the operating system, why not), then one could finally use all those computers that gather dust while sitting at 5% CPU usage, or not even turned on, to do real work. I certainly look forward to that! Or, and that would be really cool, a peer-to-peer cloud computing network, free to use by anyone entering the cloud.

Saturday, November 22, 2008

Dexter in the Dark - Jeff Lindsay

Well, I just said I couldn't wait for the third book, didn't I? :) Anyway, Dexter in the Dark was a bit of a disappointment to me. Apparently, Dexter's inner demons are just that, demons, linked to some ancient deity from the times of Solomon called Moloch, which is like an alien parasite thing. Really... What did Lindsay do? Read Snow Crash? Watch Fallen? Try to mix Stargate Goa'ulds with Wicker Man and Eyes Wide Shut? Geez!

Just when I was getting comfortable with the character of Dexter, thinking that Jeff Lindsay was a genius for portraying a type of character I had always thought of writing, he takes all that inner maniacal urge that both empowered and limited the character and transforms it into an external, fantasy-like thing. Bad writer, bad!

Anyway, that doesn't mean I didn't enjoy the book. I just think that even when the third season of the TV show became too far-fetched, it was still safe compared to this. I mean, until now Dexter was a brilliant guy with a dark path and a sort of artificial morality; mix in some police stuff, some blood spatter, the weird police sergeant sister, and it was a perfect setting for introspection and solitary struggle. I loved that! And now demons? As Doakes would have put it, "the hell for?".

The fourth Dexter book is supposedly due on February 5th, 2009. I hope Lindsay abandons the weird supernatural crap and instead focuses on Dexter's training of his adoptive children in the art of killing. Otherwise I can only see it turning toward one of the many bad directions like Blade or Hellboy or some other green "hybrid saves the planet" thing.

Dearly Devoted Dexter - Jeff Lindsay

Dearly Devoted Dexter is much darker than the first Dexter book. Maybe it is just because all the facts about Dexter are clear and it starts with a gruesome murder, insane special-forces style. The title comes from the fact that he helps his sister, now partially in the loop about his Dark Passenger, to solve the newest serial killer case. Of course Deb, now a sergeant after Laguerta died, has a personal stake in this, since one of the people the murderer abducted, and intends to do bad things to, is her boyfriend, with whom she is very much in love.

It is interesting to develop the Dexter character in this way, especially since he is described as totally indifferent to the horrible fate of people he doesn't care about, yet he is still compelled to help his sister out.

I was a bit disappointed by the police work involved. If I were to believe Lindsay, the Miami police are a bunch of morons, (badly) following a set of procedures, with no real talent other than badmouthing.

Elements from this second book in the Dexter series were clearly used in the TV series, but it is already a completely different story. The FBI agent that Deborah briefly dates in the show was inspired by the character of Kyle, a shady government agent she falls in love with in this book. The stalking of Dexter by the grumpy sergeant Doakes is also mirrored from this book, although the motives and the outcome are completely different.

Again, the show develops the Dexter character more and its story is more complex than the book's, but by now it is obvious the TV show and the books are going in completely different directions.

All in all, a bit better than the first: darker, but also funnier. I haven't laughed so much reading a book in a long time. Can't wait for the third book now.

Friday, November 21, 2008

StringBuilder substring?

Well, it may have been obvious for many, but I had no idea. Whenever I was writing a piece of code that uses a StringBuilder I was terribly upset by the lack of a substring method.

It was there all along, only it is called ToString. You give it a startIndex and a length and you're set.

Looking at the source, I see that in .NET 1.1 this method is just a normal Substring on the internal string used by the string builder. In .NET 2.0, the method uses an InternalSubStringWithChecks internal method of the string class, which in turn uses an InternalSubString unsafe method that seems to be more basic and thus faster.
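A minimal sketch of the overload in action (the sample string is mine):

```csharp
using System.Text;

var sb = new StringBuilder("Hello, world!");

// The two-argument ToString overload behaves like Substring(startIndex, length)
string sub = sb.ToString(7, 5); // "world"
```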

Darkly Dreaming Dexter - Jeff Lindsay

Book cover, obviously printed after the TV series started

I have been watching this TV series called Dexter and slowly but surely I fell in love with it. It features a psychopathic serial killer whose hobby is killing other killers. The story is long and I suggest you watch it to get it fully. Anyway, the series has reached season 3 and stars Michael C. Hall, whom you may recognize from the Six Feet Under TV series. I've also noticed that the series is based on a book! So, naturally, I got the book and started reading it. It's Michael C. Hall on the cover there.

Darkly Dreaming Dexter is the first in a series of Dexter books by Jeff Lindsay. While it starts pretty much the same as the series, the series quickly moves away from the script in the book. However, the spirit is there, even if, of course, they had to make the lead character a little more likable in the series and the whole thing less bloody.

Imagine an emotionless killer, raised by his cop father to kill according to a code and also to be thorough and attentive to the details so that the police wouldn't catch him. He is also working for the Miami police department as a blood spatter analyst. The inner dialogues are really delicious, the way he sees the world as a cynical dark Data is both funny and deep. Lindsay manages to portray an alien being, silently watching the world we take for granted, hunting on the edge of our own morality.

And while I do enjoy the book, I have to say that the series is more complex and the story a bit more realistic. So, there, finally a movie or series that surpasses the book!

Wednesday, November 19, 2008

Cascading Dropdowns and 'Invalid postback or callback argument' error

Actually, I think this applies to any dynamic modification of drop down list options. Read on!

I have used a CascadingDropDown extender from the AjaxControlToolkit to select regions and provinces based on a web service. It was supposed to be painless and quick. And it was. Until another dev showed me a page giving the horribly obtuse 'Invalid postback or callback argument. Event validation is enabled using <pages enableEventValidation="true"/> in configuration or <%@ Page EnableEventValidation="true" %> in a page. For security purposes, this feature verifies that arguments to postback or callback events originate from the server control that originally rendered them. If the data is valid and expected, use the ClientScriptManager.RegisterForEventValidation method in order to register the postback or callback data for validation.'. As you can see, this little error message basically says there is a problem with a control, but doesn't care to disclose which. There are no events or overridable methods to enable some sort of debug.

Luckily, Visual Studio 2008 has source debugging inside the .NET framework itself. Thus I could see that the error was caused by the drop-down lists I mentioned above. Google told me that somewhere in the documentation of the CascadingDropDown extender there is a mention of setting enableEventValidation to false. I couldn't find the reference, but then again I didn't look too hard, because that is simply stupid. Why disable event validation for the entire page because of one control? It seems reasonable that Microsoft left it enabled for a reason. (Not that I accuse them of being reasonable, mind you.)

Analysing further, I realised that the error kind of made sense. You see, the drop-down lists were not bound with data that came from a postback. How can one POST a value from a select HTML element if the select did not have it as an option? It must be a hack. Well, of course it was a hack, since the cascade extender filled the drop-down list with values.

I tried to find a way to override something, to make only those two drop-down lists skip event validation. I couldn't find any way to do that. Instead, I decided to register all possible values with Page.ClientScript.RegisterForEventValidation. And it worked. What I don't understand is why this error occurred only now, and not in the first two pages I had built and tested. That is still to be determined.

Here is the code:

foreach (var region in regions)
    Page.ClientScript.RegisterForEventValidation(
        new PostBackOptions(ddlRegions, region));

It should be used in a Render override, since the RegisterForEventValidation method only allows its use in the Render stage of the page cycle.

And that is it. Is it ugly to load all possible values in order to validate the input? Yes. But how else could you validate the input? It is a little more work, and it was a hidden bug that appeared when I least expected it, but now even the input from those drop downs is more secure.

My control was used in two pages with EnableEventValidation="false" and that's why it didn't throw any error. Anyway, I don't recommend setting it to false. Use the code above. BUT, if you don't know where the code goes or you don't understand what it does, better use that setting and save us both a lot of grief.

Tuesday, November 18, 2008

Thoughts on Prize Extraction Games

We are working on these projects in which people either receive an email with an invitation or they register on a web site or they simply play and expect an instant result. What is the best way to compute the chance of winning?

Case 1: People register and then, at a certain date, an extraction of prizes takes place. Ah, this is the best and simplest situation. You have the list of players and the list of prizes. Just do an index = Random(0, number_of_players) and, if index is smaller than number_of_prizes, you know the person won a prize. A second random over the number of prizes will determine the prize itself. The win probability is number_of_prizes/number_of_players.
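That draw can be sketched like this (a minimal sketch mirroring the formula above; the variable names are mine):

```csharp
var rnd = new Random();

// One draw per player: indices 0..numberOfPrizes-1 count as winning,
// so the win probability is numberOfPrizes / numberOfPlayers.
int index = rnd.Next(0, numberOfPlayers);
bool won = index < numberOfPrizes;

// A second draw determines which prize was won.
int prizeIndex = won ? rnd.Next(0, numberOfPrizes) : -1;
```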

Case 2: People enter a site that allows them to play instantly and win a prize. Here the problem is trickier. While the algorithm is basically the same, prizes over people, you don't have the total number of players. There are some subcases here based on the following parameters:
  • The win campaign lasts for a certain amount of time or is indefinite
  • The prizes must all be given at the end of the campaign or not
  • The players play after they have received an invitation (email, sms, etc) or just randomly coming from ad clicks or for some information on the site

Let's assume the campaign doesn't last for a finite time. The only solution is to pick a win probability and be done with it until you run out of prizes. You would normally compute this by considering the number of people playing over a period of time, in other words the speed at which people play. However, in this case the only thing influencing the selected probability is a psychological one: how many people would need to win in order to have a marketing effect?

Now, if the campaign does have a finite time, you would use the speed at which people play to estimate the total number of people that will play. Let's assume you know people are visiting your site at an average rate of 1000 per hour; you then see how many of them play and remember this percentage, so you can estimate the number of players per hour, and you just multiply that number by the total number of hours in the campaign. Again, we get to the prizes-over-people formula.
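As a worked example with made-up numbers (1000 visitors per hour, 20% of them playing, a two-week campaign, 500 prizes):

```csharp
double visitorsPerHour = 1000;
double playingRate = 0.20;      // the measured percentage of visitors who play
double campaignHours = 14 * 24; // two weeks

// players per hour times campaign hours
double estimatedPlayers = visitorsPerHour * playingRate * campaignHours; // 67200

// prizes over people
double winProbability = 500 / estimatedPlayers; // ~0.0074, i.e. about 0.74%
```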

However, it is very important to know how people decide to participate in the extraction!

If it is just a game added to one's site, then the people are coming and going based on that site's popularity and hourly/daily distribution (since the number of visitors fluctuates). So computing this from the first hour of people coming to the site and playing doesn't help, but it might when using the first day, and more so the first week. One week of statistical data is best for estimating the number of people over time. Then the formula is number_of_prizes_available_per_week/people_visiting_per_week, where the number of prizes available per week is either the total number of prizes spread over the finite campaign time or an arbitrary number chosen by the campaign creator.

If, instead, people are being invited to play, as following an email promotion campaign, let's say, then they will come as soon as they read their email. That means they will flock to your site in the first hours, then just trickle in the next week, then nothing. That means that estimating the total number of players from the first hour or day is not really feasible unless you are certain of a statistical distribution of people playing games after email campaigns. It is difficult as different messages and designs and game types might attract more or less people.

A hybrid can also exist: a game on a site that people are also invited to play via email. Then all the parameters above must be used. In any case, the best estimation I can think of comes from the totals of players in similar campaigns. The more similar, the better.

But what if ALL the prizes must be given to people, as required by law or by simple common sense (so as not to be seen as keeping some for yourself or your friends)? Then one can adjust the probability to suit the extraction speed. The same prizes-over-people formula is used, but only on the remaining values. The probability of winning is given by number_of_remaining_prizes/number_of_remaining_people.
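The adaptive formula can be sketched as follows (a sketch only; how you estimate the remaining players is the hard part discussed above, and the names are mine):

```csharp
// prizes over people, applied to the remaining values only
double WinProbability(int remainingPrizes, int estimatedRemainingPlayers)
{
    if (estimatedRemainingPlayers <= 0) return 0.0;
    return (double)remainingPrizes / estimatedRemainingPlayers;
}

// the fate of one player under the current estimate
bool PlayerWins(Random rnd, int remainingPrizes, int estimatedRemainingPlayers)
{
    return rnd.NextDouble() < WinProbability(remainingPrizes, estimatedRemainingPlayers);
}
```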

But that has some disadvantages. If the total number of participating people is badly estimated, it will result in a roller coaster of probabilities. People playing in the first part of the campaign will be either advantaged or disadvantaged compared to the people in the last part, as the total number of players is adjusted over time to compensate for the initial bad estimation.

Let's do a small example:

Day                            1      2      3      4      5      6      7
People playing              7500   1000    400    300    200    300    100
Estimated total players    15000  25000  12000  11000  10000  10000  10000
Estimated remaining         7500  16500   3100   1800    600    200    100
Remaining prizes (day start) 100     50     47     42     36     27     13
Win probability            0.66%  0.30%  1.52%  2.33%  6.00% 13.50% 13.00%

As you can see, the people playing first were pretty much screwed, because the total number of players was expected to be 15000 and the distribution closer to linear. After half of them played on the first day, panic made them increase the expected total to 25000 while thinking what to do. Then they realised that the distribution of players is affected by the fact that everybody plays right after reading their email and then probably never comes back. They adjusted the win probability every day and, as you can see, it was good to play in the last days.

But what would have happened if 1) they had known the percentage distribution of players would be 75, 10, 4, 3, 2, 3, 1 after an email campaign and 2) that the total number of players would be a percentage of all emails sent, and so had estimated 10000 players and the right distribution?

Day                            1      2      3      4      5      6      7
People playing              7500   1000    400    300    200    300    100
Estimated total players    10000  10000  10000  10000  10000  10000  10000
Estimated remaining         2500   1500   1100    800    600    300    100
Remaining prizes (day start) 100     25     15     11      8      6      3
Win probability            1.00%  1.00%  1.00%  1.00%  1.00%  1.00%  1.00%

Even when computing the number of remaining prizes over the remaining players every day, the probability stays a constant 1%. Of course, one could say: "Why didn't they stick to their 0.66% probability and be done with it?" Like this:

Day                            1      2      3      4      5      6      7
People playing              7500   1000    400    300    200    300    100
Estimated total players    Not important
Estimated remaining        Not important
Remaining prizes (day start) 100     43     40     38     37     35     34
Win probability            0.66%  0.66%  0.66%  0.66%  0.66%  0.66%  0.66%

Everything is perfectly honest, only they remained with a third of the prizes on hand. Now they have to give them to charity and be suspected of doing it on purpose for whatever distant relative works at that charity.

Well, think about it, and let me know what you think. Are there smarter solutions? Is there a web repository of statistical data for things like that?

Monday, November 17, 2008

Claymore - a pretty cool manga/anime

The story in Claymore is pretty standard: monsters attack people, people are powerless, therefore an organization of hybrids (female warriors carrying deadly claymore swords) emerges to protect people from said monsters. So it's like Blade, in theory. But in reality it has the feel of Berserk (the first, cool part, not the crappy lingering mess that it is now). Or you can imagine Naruto, with the monster inside and everything, fighting against a species of demon foxes. Only without the silliness and all the mentoring.

I really liked the manga and I can barely wait for it to continue; unfortunately it is released at about one chapter per month. The 26-episode anime series follows the manga story closely, but unfortunately ends prematurely, with a different idea in the last two episodes. Still, that is a lot better than Berserk leaving us in the dark at the end of its anime, or other series that just end in mid air.

Bottom line, if you liked Berserk, you will like this. If you like Naruto/Bleach, you will like this. I can even throw a little Akira in, to convince you, but it would probably be a stretch :)

Tuesday, November 11, 2008

How to return anonymous types from .Net 3.5 methods or how to cast objects to anonymous types

I have found this great link in Tomáš Petříček's blog which gives a very simple and elegant solution to casting to an anonymous type.

Short story shorter:

// Cast method - thanks to type inference when calling methods it
// is possible to cast object to type without knowing the type name
T Cast<T>(object obj, T objOfTypeT)
{
    return (T)obj;
}

So simple! You see that the objOfTypeT parameter is never used, but the type inferred from it is!

Correct usage: Cast(obj, anonObj)
Incorrect usage: Cast(obj, anonObj.GetType())
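A usage sketch (GetUntypedValue is a made-up method standing in for whatever returned the boxed anonymous instance; this only works if obj really is that same anonymous type, created in the same assembly):

```csharp
object obj = GetUntypedValue(); // boxed anonymous type, e.g. new { Name = "Ana", Age = 20 }

// The second argument is never read; it only drives type inference.
var person = Cast(obj, new { Name = "", Age = 0 });
Console.WriteLine(person.Name); // statically typed access again
```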

The Temporal Void - Peter F. Hamilton

Book cover

Wee! Another Peter F. Hamilton book has been published. This time it is the second part of the Void trilogy, an ongoing series set in the Commonwealth Saga universe, but much later. Many characters are borrowed from said saga, so it would be a good idea to read that one first. Besides, as is Hamilton's style, the second book starts abruptly from the end of the first one and ends abruptly, awaiting the third part.

And, again, like in the Night's Dawn trilogy, the plot is a combination of stories, one set in the technological future of mankind and one in a feudal, fantasy-like universe. Hamilton's talent is to combine these two into a believable common narrative. They are not as linked as in Night's Dawn and, I have to admit, I like the fantasy arc better, even if it is the classic Messiah myth. Maybe because it is not contiguous, but rather made up of small stories that have a beginning and an end.

Well, either way, it was a great book and I am waiting for the third part, due to be released in far away late 2009 or even 2010 :(

Monday, November 10, 2008

Oil is not dead yet!

First of all, I want to say that I know I haven't been writing many tech articles lately and I've disappointed quite a few of the people reading this blog. I intend to rectify that, even if I am suffering from one of those dry tech spells at the moment :)

Anyway, about the oil. What if one could replicate the process that creates oil naturally, speed it up, and use it not only for creating oil, but also for getting rid of a lot of organic waste? The technology is called Thermal Depolymerization and is patented by the Changing World Technologies company. So, if one is to believe the Wikipedia article, while the test factory and the company itself have had problems ranging from technological hiccups, to having to pay for the waste they use as fuel, to never-ending complaints from neighbours about the smell of bio waste, the technique was shown to work!

So, while the process only allows the production of slightly cheaper oil than the extracted kind, it will certainly get a big boost from the increase in the price of underground oil.

Here is a link to a 2003 interview with the CEO of Changing World Technologies, utterly demolished by Paul Palmer, a chemistry PhD, here. Also, this process is nothing very new or unique! There are other methods that are said to transform organic waste into ethanol, as described in this link. So, oil may not be dead yet.

Thursday, October 30, 2008

Sorting LInQ results in a GridView

I had this really old site that I was asked to "upgrade": .NET 1.1 to 3.5. So I had to change old ADO.NET SqlConnection code to LINQ to SQL. A lot of unforeseen issues. Sometimes it is close to impossible to recreate a simple SQL query without changing the database, as in the case of updating tables or views without primary keys.

Anyway, I got stuck on a very simple issue: how to sort a LINQ query based on the string returned by the GridView Sorting event. After a lot of tries and Googling I found it was easier to use third-party software (although it's more of a 2.5th-party software, as it is made by Scott Guthrie from Microsoft). Here is the link to the blog entry: Dynamic LINQ (Part 1: Using the LINQ Dynamic Query Library). You can find there a sample project to download, with the LINQ Dynamic Library inside.

With this library and its OrderBy extension methods the code becomes:

var somethings = from something in db.Somethings ....;
var data = somethings.OrderBy(Sort + " asc");
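Wiring this into the GridView's Sorting event might look like the sketch below (grid and db are assumed page members, and the string-based OrderBy overload comes from the Dynamic LINQ library):

```csharp
protected void grid_Sorting(object sender, GridViewSortEventArgs e)
{
    var somethings = from something in db.Somethings
                     select something;

    // e.SortExpression holds the column name the user clicked
    var data = somethings.OrderBy(e.SortExpression + " asc");

    grid.DataSource = data;
    grid.DataBind();
}
```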

Friday, October 24, 2008

Globalization and Localization for AjaxControlToolkit controls

I had this calendar extender thingie on my site and the client was Italian, so I had to translate it into Italian. Nothing easier: just enable EnableScriptGlobalization and EnableScriptLocalization and set the culture either in Web.config or in the Page settings.

However, the footer of the calendar kept showing "Today" instead of "Oggi". I checked the source and noticed that it used an AjaxControlToolkit.Resources.Calendar_Today JavaScript variable to create the footer. Step by step, I narrowed it down to the AjaxControlToolkit resources! The funny thing is that they didn't work. I tried just about everything, including Googling, only to find people having the opposite problem: they would have all these resource dlls created in their Bin directory and couldn't get rid of them. I had none! I copied the directories from the ACTK and nothing happened.

After some research I realized that the Ajax Control Toolkit does NOT include the localized resources UNLESS it is built in Release mode. I was building everything in the solution, including the ACTK, in Debug mode. Changing the AjaxControlToolkit build configuration to Release made it all work.

Tuesday, October 21, 2008

Reinvention, a bad or a good thing?

A new (and old) buzzword: to reinvent. It is always a good thing to reinvent yourself, they say, with the effect of relieving boredom and living a "new" life. You may discard bad or useless things in favor of good things. It is also good to reinvent something somebody else did, like a movie. You take the idea, you remove the bad things, you add good things. But, as in the case of the benevolent tyrant, the definition of good and bad is always fuzzy.

Was it good to reinvent Battlestar Galactica? I say YES! It was (and still is, despite the screenwriters' efforts) the best sci-fi series out there. Of course, that is my opinion. Was it good to reinvent Terminator, incarnated as a machine that looks like a teenage girl? Ahem. But I still watch it. Was it good to reinvent Superman as a troubled teenager? Puh-lease! Come... on! Nah-uh! (See, I address the younger demographic here.)

Because, you see, the people that decide what is good and bad in movies are actually the money people. They look at superficial statistics that only show... money! They make abhorrent remakes of decent films (like Indiana Jones 4 - The Rape of Indiana) or they turn every hero into man/woman/teenager/animated-character/doll versions that bring nothing new.

The original vs. the reinvented crew

In the case of Star Trek, they made the first low-budget series, which achieved cult status regardless of bad production values and some ridiculous scripts. Then they made a sequel (at that time reinvention was not invented yet) where Patrick Stewart redefined the space captain as a cerebral, science-oriented man, but with lots of guts. Then they started the old routine: make the captain black, make him a woman, replace the ship with a station, then with another ship, but in some other place, etc. They even made a prequel which, for almost a full season, was decent in both acting and scenarios. What was missing, of course, was a teenage Star Trek captain. Well, no more!

"Star Trek", the 2009 movie in the making (and no doubt, with a series looming if money is made), features a young Kirk and (what a fallacy) a young Spock! The director is none other than my least favourite person in the world: J.J. Abrams, the maker of such abysmal stupidities (well received by the general audience, though) as Alias, Lost and Fringe. The writers are Abrams's old team, Roberto Orci and Alex Kurtzman, the brilliant creators of such idiocies as Alias, Fringe and Xena/Hercules!

I am trying to keep an open mind here, but I would venture to guess that the new Star Trek will have big booming sounds whenever something strange happens, will be filled with inexplicable things that will never be explained, except maybe in the movie (but I doubt it, they have to plant the hook for a series) and will have people calling each other by name obsessively, whether the need for it arises or not. So, it may be cool, but I expect it to be baktag!

Monday, October 20, 2008

Path Vol.2 - Apocalyptica feat. Sandra Nasic (Guano Apes)

I've listened to this song for a long time now; it was only proper that it would appear on my blog sooner or later. Sandra Nasic sang for Guano Apes and, after the band split, she released a solo album in 2007 called The Signal, which features some good songs, although a little mellow for my taste. You can listen to fragments of some of Sandra's songs on her MySpace site, visit her official site or just plain google for videos like I do :)

So listen to this symphonic Sandra Nasic sound. I wish she had done more pieces like this.

Saturday, October 18, 2008

Brisingr by Christopher Paolini (Eragon 3)

Brisingr is the third book in the Inheritance cycle (now a cycle because the author could not end the story in only three books). While I enjoyed reading it and I know that Paolini had all the best intentions writing it, I would not recommend it.

I have too little recollection of the first two books, to tell you the truth, but I do remember I was captivated by the action in them, if nothing else. The "magical technology" also had a great lure for me. In the third installment, all of these are missing or of poor quality. Roran is far more interesting than Eragon in this book, while the bad characters have lost a few dimensions (from the few they already had) and have become pathetic. T'Pol (sorry... I meant Arya) is docile and closer to the human heart, making her completely uninteresting, while the elves in general (and Oromis and Glaedr in particular) act like Asgard on pot.

Why use Star Trek and Stargate terms to describe a fantasy book? Because it seems that's the only real inspiration for the third book of the Inheritance cycle. I could have done without the Doctor Who references in it, as well.

You can see a little YouTube video of Christopher Paolini talking about Brisingr here, where an "unofficial" fan club is trying to earn money from said YouTube by disabling the embedding option.

Friday, October 10, 2008

AutoCompleteExtender with images (or any kind of template)

A few days ago a coworker asked me about implementing an autocomplete textbox with not only text items, but also images. I thought, how hard can it be? Surely the guys that made the AutoCompleteExtender in the AjaxControlToolkit thought about it. Yeah, right!

So, I needed to tap into the list-showing mechanism of the AutoCompleteExtender, then (maybe) into the item-selected mechanism. The AutoCompleteExtender exposes the OnClientShowing and OnClientItemSelected properties. Each expects a function name; the function receives a behaviour and an args parameter.

Ok, the extender creates an HTML element to contain the completion list items, or gets one from the CompletionListElementID property (which is obsolete anyway). It creates an LI element for each item (or a DIV if CompletionListElementID is set). So all I had to do was iterate through the childNodes of the container element and change their content.

Then, on item selected, the AutoCompleteExtender unfortunately tries to take the text value with firstChild.nodeValue, which pretty much fails if the first child of the item element is not a text node. So we will tap into OnClientItemSelected, whose args object contains the item, the text extracted as mentioned above (useless to us), and the object that was passed from the web service that provided the completion list. The last one we need, but keep reading on.
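To see why that extraction fails, here is a tiny standalone sketch of my own (plain objects standing in for DOM nodes; this is an illustration, not extender code):

```javascript
// nodeValue is only meaningful for text nodes; for element nodes it is null.
// A plain completion item is an LI whose first child is a text node, but a
// templated item's first child is an element (e.g. an <img>).
var plainItem = { firstChild: { nodeType: 3, nodeValue: "some text" } };
var templatedItem = { firstChild: { nodeType: 1, nodeValue: null } };

console.log(plainItem.firstChild.nodeValue);     // "some text"
console.log(templatedItem.firstChild.nodeValue); // null - useless as the item text
```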

So the display is easy (after you get the hang of the Microsoft patterns). But now you have to return a list of objects, not mere strings, in order to get all the information we need, like the text and the image URL. Here is the piece of code that interprets the values received from the web service:
// Get the text/value for the item
try {
   var pair = Sys.Serialization.JavaScriptSerializer.deserialize('(' + completionItems[i] + ')');
   if (pair && pair.First) {
      // Use the text and value pair returned from the web service
      text = pair.First;
      value = pair.Second;
   } else {
      // If the web service only returned a regular string, use it for
      // both the text and the value
      text = pair;
      value = pair;
   }
} catch (ex) {
   text = completionItems[i];
   value = completionItems[i];
}

In other words, it first tries to deserialize the received string, then checks whether it is a Pair object (whether it has a First property); otherwise it passes the deserialized object as both text and value! If deserialization fails, the entire original string is used. Bingo! So on the server side we need to serialize the array of strings we want to send to the client. And we do that by using System.Web.Script.Serialization.JavaScriptSerializer. You will see how it fits into the code.
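The same decision logic can be sketched standalone. Assume Sys.Serialization.JavaScriptSerializer.deserialize behaves roughly like JSON.parse here, and note that the function name interpretCompletionItem is mine, for illustration only:

```javascript
// Mirror of the extender's logic: try to deserialize; if the result looks
// like a Pair (has a First property) use First/Second; otherwise use the
// whole value; if parsing throws, fall back to the raw string.
function interpretCompletionItem(raw) {
  try {
    var pair = JSON.parse(raw);
    if (pair && pair.First) {
      return { text: pair.First, value: pair.Second };
    }
    return { text: pair, value: pair };
  } catch (ex) {
    return { text: raw, value: raw };
  }
}

console.log(interpretCompletionItem('{"First":"16","Second":"images/demo.gif"}'));
// text "16", value "images/demo.gif"
console.log(interpretCompletionItem('just a string')); // falls back to the raw string
```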

So far we displayed what we wanted, we sent what we wanted, all we need is to set how we want the completion items to appear. And for that I could have used a simple string property, but I wanted all the goodness of the intellisense in Visual Studio and all the objects I want, without having to Render them manually into strings.

So, the final version of the AutoCompleteExtender with images is this: A class that inherits AutoCompleteExtender, but also INamingContainer. It has a property ItemTemplate of the type ITemplate which will hold the template we want in the item. You also need a web service that will use the JavascriptSerializer to construct the strings returned.

Here is the complete code:

using System;
using System.IO;
using System.Web.UI;
using System.Web.UI.WebControls;
using AjaxControlToolkit;

namespace Siderite.Web.WebControls
{
   /// <summary>
   /// AutoCompleteExtender with templating
   /// </summary>
   public class AdvancedAutoComplete : AutoCompleteExtender, INamingContainer
   {
      private ITemplate _template;

      public ITemplate ItemTemplate
      {
         get { return _template; }
         set { _template = value; }
      }

      protected override void OnInit(EventArgs e)
      {
         base.OnInit(e);
         const string script = @"
function AdvancedItemDisplay(behaviour,args) {
   var template=behaviour.get_element().getAttribute('_template');
   //if (template==null) template='${0}';
   for (var i=0; i<behaviour._completionListElement.childNodes.length; i++) {
      var item=behaviour._completionListElement.childNodes[i];
      var vals = item._value;
      var html=template;
      for (var c=0; c<vals.length; c++)
         html=html.replace(new RegExp('\\$\\{'+c+'\\}','g'),vals[c]);
      item.innerHTML=html;
   }
}

function AdvancedSetText(behaviour,args) {
   var vals=args._value;
   var element=behaviour.get_element();
   var control = element.control;
   if (control && control.set_text)
      control.set_text(vals[0]);
   else
      element.value = vals[0];
}";
         ScriptManager.RegisterClientScriptBlock(this, GetType(), "AdvancedAutoComplete", script, true);

         OnClientShowing = "AdvancedItemDisplay";
         OnClientItemSelected = "AdvancedSetText";
      }

      protected override void OnPreRender(EventArgs e)
      {
         base.OnPreRender(e);
         string template = GetTemplate();
         ((TextBox)TargetControl).Attributes["_template"] = template;
      }

      private string GetTemplate()
      {
         Content ph = new Content();
         ph.Page = Page;
         ItemTemplate.InstantiateIn(ph);
         HtmlTextWriter htw = new HtmlTextWriter(new StringWriter());
         ph.RenderControl(htw);
         return htw.InnerWriter.ToString();
      }
   }
}


using System.Collections.Generic;
using System.Web.Script.Serialization;
using System.Web.Script.Services;
using System.Web.Services;

/// <summary>
/// Web service to send auto complete items to the AdvancedAutoComplete extender
/// </summary>
[WebService(Namespace = "")]
[WebServiceBinding(ConformsTo = WsiProfiles.BasicProfile1_1)]
[ScriptService]
public class MainService : WebService
{
   [WebMethod]
   public string[] GetCompletionList(string prefixText, int count)
   {
      JavaScriptSerializer jss = new JavaScriptSerializer();
      List<string> list = new List<string>();
      for (int c = 0; (list.Count < count) && (c < 100000); c++)
      {
         string s = (c*c).ToString();
         if (s.StartsWith(prefixText))
         {
            object[] item = new object[] {s, "images/demo.gif"};
            list.Add(jss.Serialize(item));
         }
      }
      return list.ToArray();
   }
}

An example of use:

<asp:TextBox runat="server" ID="tbMain"></asp:TextBox>
<cc1:AdvancedAutoComplete ID="aceMain" runat="server" BehaviorID="mhMain" TargetControlID="tbMain"
   ServiceMethod="GetCompletionList" ServicePath="~/MainService.asmx" MinimumPrefixLength="0">
   <ItemTemplate>
      <asp:Image runat="server" ID="imgTest" ImageUrl="${1}" /><asp:Label runat="server"
         ID="lbTest" Text="${0}"></asp:Label>
   </ItemTemplate>
</cc1:AdvancedAutoComplete>

That's it! All you have to do is make sure the controls in the template render ${N} placeholders that get replaced with the first, second, ..., Nth value in the list sent by the web service. The text that ends up in the textbox is always the first value in the list (${0}).
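The ${N} substitution itself is just the small regex-replace loop from the registered script; here it is isolated (the name fillTemplate is mine):

```javascript
// Replace each ${N} placeholder with the N-th value sent for the item
function fillTemplate(template, vals) {
  var html = template;
  for (var c = 0; c < vals.length; c++)
    html = html.replace(new RegExp('\\$\\{' + c + '\\}', 'g'), vals[c]);
  return html;
}

console.log(fillTemplate('<img src="${1}"/><span>${0}</span>', ['16', 'images/demo.gif']));
// <img src="images/demo.gif"/><span>16</span>
```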

Restrictions: if you want to use this as a control in a library and THEN add some more functionality on the Showing and ItemSelected events, you need to take into account that those are not real events, but JavaScript functions, and the design of the AutoCompleteExtender only accepts one function name. You could create your own function that also calls the one described here, but that's beside the point of this blog entry.

Monday, October 06, 2008

Zodiac by Neal Stephenson

Zodiac is an environmental eco-thriller. It's Stephenson's second novel and it started in a similar way to The Big U, which I couldn't really read through. But I had nothing else to read so I kept going until it got funny and good. If you read it, try to get over the bad start, because it is not a bad book at all.

The basic plot is that of a pragmatic environmentalist with a chemistry background working against the waste dumping industry. In the end he uncovers a plot of global implications and, of course, foils it. But the story itself is more important than the ending. This is not one of those books you read in half a day, driven by the need to know how it all turns out, but one of those you read waiting to see what the main character is going to do or say next, while going towards the predictable finish.

Here is a much better commentary than mine; what I can say is that the book is definitely not sci-fi, rather a thriller, but it is well written. It makes for a good train book.

Monday, September 29, 2008

How to lose weight [Solved]

In this day and age, being or getting fat is treated like a drug addiction, in every sense of the word "treated". Fat people are marginalised, socially pushed to find a solution. Many solutions are provided, from pills that help you lose weight to food replacements and "health food", so that they can slowly get rid of this addiction to food. And in many respects, fat people behave like addicts. They have cravings, they find reasons for eating "just a little bite", they show withdrawal symptoms, they try quitting and they fail. There are also "pushers", trying their best to hook you on chips, sugary drinks, fatty meats and fast food, as well as food supplements and slimming drinks.

The funny thing is that the actual solution to extra weight is the same as in the case of hard drugs: you want to get rid of heroin addiction, stop taking heroin. There is this slightly annoying fact that you can live without heroin, but you can't without food. But there are people working on it.

Anyway, I started writing this because, as far as I am concerned, I know I have found the solution. And it is not so hard to do, either. I started at 116kg, lost 16 in 4 months, started drinking Coca-Cola like crazy while keeping the diet and didn't gain much weight, then started eating a lot of pasty, bready, pizzy and fatty food and gained 10kg in about a month and a bit, then I started a custom (less strict) diet again and got to 104kg in a month. In other words, even if I stop the diet, I have to make an effort to gain weight. Even if I slip a little, it doesn't matter. And if I settle on my own food style, one that avoids too much fat and bread and sugar, I don't gain weight even if I stop the diet completely. It is way easier to gain weight than to lose it, but that is not the point.

Bit by bit, though.

You have to consider that this is not a weight loss blog and that I am only describing my experience here, but hey, it worked. From what I could gather there are three points that need covering:
  1. Don't gain extra weight
  2. Eat the right stuff to maintain your weight
  3. Keep your metabolism running to lose weight
Pretty easy.

In order to not gain extra weight, eat less. A normal sedentary human needs about 2000 calories per day. We eat a lot more than that and a lot of it just goes out in the toilet. A small part (but significant to this blog entry) is stored as fat. By eating 1500 calories you stop gaining weight and start losing it. But how does one know what 1500 calories mean? Very simple: you get a list of what you are supposed to eat every day at every meal and you don't stray from it.
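As a back-of-the-envelope check of why 1500 calories works (my own arithmetic, not the nutritionist's, using the commonly cited approximation of roughly 7700 kcal per kilogram of body fat, which individual metabolisms will not follow exactly):

```javascript
// Rough deficit arithmetic; 7700 kcal/kg of fat is an approximation, not a law
var maintenance = 2000;                   // kcal/day for a sedentary adult
var intake = 1500;                        // kcal/day on the diet
var dailyDeficit = maintenance - intake;  // 500 kcal/day
var kgPerMonth = dailyDeficit * 30 / 7700;
console.log(kgPerMonth.toFixed(1) + " kg/month"); // about 1.9 kg/month
```

My actual 16 kg in 4 months was faster than that, so clearly more than the bare calorie deficit was at work.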

In order to maintain your weight even after stopping the diet, you need to eat the right stuff. I don't know exactly what that means, but judging by the food I am allowed to eat and what I am not, and what I Googled on the net, I found out that there is a separation of food types into alkaline and acidic foods. Supposedly, you need 75% of the first category and 25% of the latter. A normal Western diet is the other way around, hence the weight gain and the accumulation of fat. Also, the division of foods into these two categories seems to have nothing to do with pH, as lemons are considered highly alkaline. Probably the terms refer to the body's response to them. Also, you need to stay off carbohydrates. This is called a "low-carb" diet. Carbohydrate is a fancy word for sugar, but there is the good sort (like in fruit and a bit of honey) and the bad sort (like refined sugar and refined flour products).

And last, but not least, eat many times a day. That is counterintuitive, but if you think about it, it makes a lot of sense. The human body is very adaptive. If you eat once a day, it will remember to shut off functions that use energy the rest of the day. Not to mention that you'll probably get an ulcer anyway. So instead of losing weight, you will lose energy. Even worse, when you start eating again, the body will just store it, since it is running in energy-saving mode. Instead, if you eat many times a day, your body knows energy is freely available and the metabolic rate will go up instead of down, helping you to burn reserves (fat).

Ok, enough theory. I will give you a general 1500 calorie diet to follow every (fucking) day. If you get bored, read the part about the routine. Don't stop. Remember, I lost 16 kilos in 4 months and then I had to make an effort to put 10 kilos back.

So, there are 5 meals a day: breakfast, brunch, lunch, lupper (sorry, couldn't help it :) ) and supper. You are allowed to drink only water and green tea with no sugar. Maybe coffee in the morning, but that's frowned upon :D.

Breakfast (7:00-8:00): 100g of low fat meat (like grilled chicken breast, fish, cremwurst (the thing in the hot dog), etc.) and/or low fat cheese (like mozzarella or any low fat cheese). 100g in total. That means about 3 (slim) cremwurst sausages. Also a tomato and a cucumber (a normal cucumber, the kind that is about the total size of a tomato :) not the huge ones). You are allowed two slices of bread, not white, but wheat bread.

Brunch (10:00-11:00): 250g of fruit. Any fruit. Or 4-5 dried fruits.

Lunch (13:00-14:00): 150g of low fat meat (preferably grilled, else you start dreaming of warm meals) and a salad (tomato salad, green salad, boiled vegetables, whatever). Use olive oil and as little salt as possible. Use lemon juice, not vinegar. For example, if you want to make yourself a tomato salad, you use 2 tomatoes and a cucumber, some olive oil and lemon juice and a little salt. That's the size of a salad, not 4 kilos of tomatoes with mayo dressing. Two slices of wheat bread.

Meal between lunch and supper (16:00): 250g of fruit. Any fruit. Or 4-5 dried fruits.

Supper (18:30-19:30): 400g of yoghurt and/or low fat meat or salad. 400g in total; use at least 200g of yoghurt, though, as it has enzymes that will help you during the night. Maybe replace the yoghurt with a less fatty soup. One slice of wheat bread.

That is it! You eat nothing else. You drink nothing other than water and green tea. You avoid sugar as much as possible; onions and carrots and peas are sweet, too. No diet drinks or sugar-free chewing gum, they don't really help. Chocolate is a no-no. Ice cream is a crime. Alcohol deserves capital punishment. (Well, at least that's what the nutritionist told me.)

My opinion of it? Well, things are not so bad. Remember that you cannot gain more weight than the weight of what you eat. Even if you lived on air and stored everything you ingurgitated, you could not gain 10 kilos if you only ate 1. Does your body store the Coca-Cola you crave? No. But it uses that sugary goodness as energy and stores anything else you eat. Will you inflate like a balloon if you go to someone's birthday party and you eat a little cake and drink some champagne? Not really, unless people start looking at you funny while you are the only one eating the cake and there is not much left. This doesn't apply if you go to birthday parties every day!

And, remember, this is for a person that makes no physical effort whatsoever other than getting up each morning and going to work. Riding your bike to work, jogging or doing physical exercises will make this go even faster. They don't have to be difficult or complex. Finding your food in restaurants or shops is difficult, therefore you should prepare your lunch at home. That might look strange, to carry your own food, but weigh it (pun intended) against the purpose of your diet and the benefits of not being fat.

The thing is that after you reach your desired weight, you still have to keep up with this kind of eating. It keeps you fit. You can eat normal food, go to restaurants, drink sugary drinks and alcohol, but remember to balance it with the diet you just went through. Sugar and alcohol are somewhat like a really unhealthy fruit; fat is like a lot of meat; not eating vegetables is just not good. Your body is the result of evolution working on people who ate mostly plants and sometimes a little meat. Consider that when you order your double cheeseburger and a diet Coke. Anyway, speaking of diet soft drinks, they don't really work. Drink the regular ones (unless you like the taste of the diet ones), just don't overdo it. It's the same with the food. Eat it rationally, even if you like to eat.

I felt no real hunger during this diet. The 5-meals-a-day thing keeps you full of energy and always with something in your stomach. You might feel hungry in the evening and then, if you reeeeeally find it hard to resist, drink a small glass of milk. Not regularly, just when you are really hungry. The milk makes your hunger go away, but it also keeps your body from losing weight.

The Routine

Well, you noticed it. The whole diet is based on daily routine. It might drive you crazy for a while, but consider that you have been doing it anyway. You wake up in the morning, wash, piss, drink the ritual coffee or whatever, get dressed, go to work, do the same thing you do every fucking day, return home and watch TV or spend "quality time" with the wife or play some silly game on the computer or even work part jobs. Then you go to sleep. It's a routine.

All you need to do is alter it a little to suit your needs.

People worse off than you have done it. People incarcerated and tortured every day found solace in the real life inside their heads, while living reality in a state of trance brought on by routine. Routine is the basis of one's comfort. You only explore when you leave your comfort zone. You can do it "in your free time", if this mythical thing even exists, or just in your head, while your body is running the automated program you set up. Now I am not really talking about losing weight here, I am going general. The thing that keeps everyone content and not going crazy is not the quality of life, but the routine of it. We train ourselves to function in the given conditions.

Talking about drugs, you cannot not talk about Trainspotting: the film that old people took as a drug-promoting movie while, for all the people that actually watched it, it was a powerful anti-drug statement. There is this part of the movie where Renton tries to quit heroin and he just hates his life. It's not because of the withdrawal symptoms, but because of the sheer boredom of it. When something that defines your pleasure is taken away, when the excitement is gone, your life feels like a slow death. It's easy to just go back to the pleasure you know. Even if it is just as boring, it feels good. But routine is what saves your ass.

The same applies to food. You dream of all the possible combinations of taste and texture and what you would do if you weren't on the diet. You don't feel hungry, you see; you don't need food, you need the pleasure of it. You want to fuck that food with your mouth and make it cum saliva. But it's all fake. When you slip and try to eat that food you realise that the fat you craved makes you feel sick, and the food you want may not be on the diet sheet, but you can just about make it with those ingredients. You don't need junk food to eat well. And that is what this is all about, after all. You get healthier by adjusting some parameters, not by losing something. Even if you can't lose all the weight you wanted, who gives a damn if you are still a little fat? You are a healthy fat guy! Not a morbidly obese piece of blubber that just waits miserably for their coronary.

So stick to the routine. Experiment with food while keeping to the principles of the diet when you get out of it. Stay in the comfort zone and leave it for short periods of time when you feel good about yourself. Choose your next stage of development and get to it when you feel like it.

Now I sound like those people trying to sell you stuff. I wanted to write this for a long time, mostly because I had to explain all of this to a lot of people over and over again. Now I have this post, you can read it here. I guess the bottom line is that your body, as well as your brain, is always in training mode. Whenever you do something that feels good, you train it to want more; whenever you do something that feels bad, you train it to want less. So it is your responsibility, after all, to choose the things that make you feel good.

Wednesday, September 24, 2008

WatiN FileUpload issues

Usually when I blog something, I write about the problem and the solution I have found. In this case, based also on the lack of pages describing the same problem, I have decided to blog about the problem only. If you guys find the solution, please let me know. I will post it here as soon as I find it myself. So here it is:

We started creating some tests for one of our web applications. My colleague created the tests, amongst them one that does a simple file upload. She used the following code:
var fu = ie.FileUpload(Find.ByName("ctl00$ContentPlaceHolder1$tcContent$tpAddItem$uplGalleryItem$fuGalleryItem"));

and it worked perfectly. She was using WatiN 1.2.4 and MBUnit 2.4.

I had WatiN 2.0 installed and MBUnit 3.0. I downloaded the tests, removed the ApartmentState thing that seems to be unnecessary in MBUnit 3.0, and ran them. On my computer the FileUpload Set method opens a file upload dialog and stops. I've tried a lot of code variants, to no avail; I've uninstalled both MBUnit and WatiN and installed the 1.2.4 and 2.4 versions. I actually tried all possible combinations, using .NET 1.1 and 2.0 libraries and changing the code. Nothing helped. On my computer the setting of the file name doesn't work.

I've examined the WatiN source and I've noticed that it used a FileUploadDialogHandler that determines if a window is a file upload window or not by checking a Style property. I have no idea if that is the correct solution, but just to be sure I inherited my own class from FileUploadDialogHandler and I've instructed it to throw an exception with a message containing the style of the first window it handles. The exception never fired, so I am inclined to believe that the handler mechanism somehow fails on my computer!

I have no idea what to do. I have a Windows XP SP3 with the latest updates and I am running these tests in Visual Studio 2008 Professional.

The only possible explanation left to me is that Internet Explorer 8 is the culprit, since my colleagues all have IE7. The maker of WatiN himself declared that identifying the windows by style is not the most elegant method possible, but he had no other way of doing it. My suspicion is that the window handling doesn't work at all in IE8, but I have no proof for it and so far I have found no solution for this problem.

Monday, September 22, 2008

Very Slow USB Memory Stick in Windows XP

I finally bought myself a laptop and I had to figure out how to transfer files from my computer to the new device. I'll spare you the details, the thing is that I finally decided to use a 4Gb flash stick that my wife had from work.

I wanted to copy about four episodes of a TV series (that's less than 1.5Gb) and it said it would take 30 minutes. I thought, well, maybe the stick is slow. I was in no hurry, so I let it copy. I noticed that it worked in great bursts of data: first copy fast, then wait, then copy again, then wait again. When I moved it to the laptop, the file system was corrupt. What the..? So I formatted the stick. Apparently, I had only the FAT32 option, not NTFS. Then I started copying again.

I Googled for it and found out that Windows XP does not enable write caching for removable drives. Me being me, I immediately went to the hardware properties of the stick and changed the way it worked from 'Optimize for quick removal' to 'Optimize for performance' (which specified that it enabled write caching). Wow! I am so smart. But I continued Googling anyway and I've learned that the setting doesn't really change anything, other than giving you the option of formatting with NTFS, which then would allow write caching.

But there was also another option, even with 'Optimize for quick removal' on: use the Windows XP command line utility Convert which is used to convert a FAT drive to an NTFS drive. I stopped the copying, only to notice that the file system was corrupt again. I deleted the files, ran chkdsk K: and then convert K: /fs:ntfs /v /NoSecurity. While the copying went a lot smoother, it still took 15 minutes to copy the damn thing. At least I could read it at the other end, anyway.

I don't exclude the possibility that drivers or stick hardware were at fault (since it is the first time this is happening to me), but be aware that you can always have this option of using NTFS instead of FAT32.

Disadvantages of using NTFS and write caching:
  1. You need to use the software option of removing the USB stick and wait until it says it is safe to remove it, otherwise you might have write errors
  2. NTFS has this ugly write-last-access-time option that you can only disable through a registry hack.
  3. NTFS sticks cannot be used for some devices like mp3 players and such, since they only know FAT32 access. Windows 98 is also oblivious to NTFS, although there are third party NTFS drivers for it

Almost all information here can be found in more detail at this link: Tips for USB pen drives.

Thursday, September 18, 2008

The Cobweb by Neal Stephenson and J. Frederick George

The Cobweb is not a sci-fi story, just a fiction thriller. It happens in modern-day America, where a small-town cop slowly unravels a plot of international proportions and implications. He has to foil it with no help from (or rather against) the corrupted systems of university academia and government security and diplomatic agencies.

Actually, this is the main subject of the book, if I can say so: throat-cutting internal politics inside the CIA, the rule that CIA operations cannot take place inside the borders of the USA and the ways to bend that rule, university scholarship stewards that live off foreign student exchanges (real or not) and bogus grants, etc. It is a bleak picture, the one painted of the CIA employees who cannot exceed their assigned duty, even if they have plenty of reason to, or else face a career stop or even dismissal.

In the end, of course, Deputy Sheriff Clyde Banks saves the day, but I can't help noticing that I knew this would happen from the very start. The real information is in the path to the end result and that is what I've appreciated in this book. The reader is taken away to discover the filthy world Stephenson and George expose.

It starts a little slow. It also provides plenty of information for would-be terrorists :) So I recommend it to everyone; even if it is not a sci-fi book, it's a solid, well made story.

Tuesday, September 16, 2008

offsetParent null in FireFox. Absolute position in FireFox yields 0,0

I've spent about a day on a thing that I can only consider a FireFox bug. In a complete reversal of what I would expect from a javascript script, it worked anywhere but in FireFox! And FireFox 2.1, no less; I haven't even installed 3.0 yet.

It concerned a simple javascript function from a third party that was supposed to get the absolute position of an element when clicked. I had written one myself a while ago, but that one didn't work either! Here is the function that I was trying to fix:
function getPos(n) {
   var t = this, x = 0, y = 0, e, d = t.doc, r;

   n = t.get(n);

   // Use getBoundingClientRect on IE, Opera has it but it's not perfect
   if (n && isIE) {
      n = n.getBoundingClientRect();
      e = t.boxModel ? d.documentElement : d.body;
      x = t.getStyle(t.select('html')[0], 'borderWidth'); // Remove border
      x = (x == 'medium' || t.boxModel && !t.isIE6) && 2 || x;
      n.top += t.win.self !== t.win.top ? 2 : 0; // IE adds some strange extra cord if used in a frameset

      return {x : n.left + e.scrollLeft - x, y : n.top + e.scrollTop - x};
   }

   r = n;
   while (r) {
      x += r.offsetLeft || 0;
      y += r.offsetTop || 0;
      r = r.offsetParent;
   }

   r = n;
   while (r) {
      // Opera 9.25 bug fix, fixed in 9.50
      if (!/^table-row|inline.*/i.test(t.getStyle(r, "display", 1))) {
         x -= r.scrollLeft || 0;
         y -= r.scrollTop || 0;
      }

      r = r.parentNode;

      if (r == d.body)
         break;
   }

   return {x : x, y : y};
}
As you can see, it is a little more complex than my own, although I don't know whether it works any better.

Anyway, I found that the problem was simple enough: the element I was clicking did not have an offsetParent! Here is a forum which discusses a possible cause for it. Apparently the Gecko rendering engine that FireFox uses does not compute offsetParent, offsetTop or offsetLeft until the page has finished loading. I didn't find anything more detailed and there were just a few pages that seemed to report a problem with offsetParent null in FireFox.

I tried to solve it, but in the end I gave up. My only improvement to the script was this line:
while (r && !r.offsetParent) {
which resulted in a more localised position, i.e. the position relative to the closest parent for which a position could be calculated.

In the end the problem was solved by restructuring the way the dynamic elements on the page were created, but I still couldn't find an official cause or a way to replicate the issue in a simple, separate project. My guess is that some types of DOM manipulation performed while the page is loading (in other words, scripts that are simply dropped on the page, rather than run in the window 'load' event, and that change the page's element tree) cause FireFox to skip computing the offset values, or even to assume that the page never finished loading.
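The tolerant offset walk described above can be sketched as a small standalone function (this is my minimal sketch, not the TinyMCE-style function quoted earlier; the elements below are mock objects, since the point is just the traversal logic):

```javascript
// Minimal sketch: accumulate offsetLeft/offsetTop up the offsetParent chain.
// When offsetParent comes back null (the Gecko behaviour described above),
// the loop simply stops and returns the best partial position it has,
// instead of failing or reporting 0,0.
function getAbsolutePos(el) {
    var x = 0, y = 0, r = el;
    while (r) {
        x += r.offsetLeft || 0;
        y += r.offsetTop || 0;
        r = r.offsetParent; // may be null in Gecko before the page has loaded
    }
    return { x: x, y: y };
}

// Example with mock elements: a child at 10,20 inside a parent at 100,200
// whose own offsetParent is null.
var parent = { offsetLeft: 100, offsetTop: 200, offsetParent: null };
var child = { offsetLeft: 10, offsetTop: 20, offsetParent: parent };
console.log(getAbsolutePos(child)); // { x: 110, y: 220 }
```

With real DOM elements the result is only the full absolute position when the offsetParent chain reaches the document body; otherwise it is the position relative to the last ancestor the chain could reach, which matches the "more localised position" behaviour noted above.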

Monday, September 15, 2008

Installing WinXP on a Laptop with SATA drives, but no floppy

The problem is that Windows XP only accepts serial-ATA drivers from a floppy diskette, and you don't have a floppy drive. What are you to do?

Google gave me a few ideas, but I had to piece them together. So, here is what you need:
  1. download the SATA drivers for your laptop
  2. download Virtual Floppy Drive by Ken Kato - at this time it is at version 2.1 - if your SATA drivers are in the form of a floppy disk image generator (a single exe that asks for a disk in the floppy drive)
  3. download nLite
  4. get a windows XP installation disk
  5. have a writable CD and a CD writer ready

And here is what you do on another computer (one that is working :)) :
  1. (if you already have the SATA driver files, skip this)Go to My Computer, Properties, Hardware, Device Manager and disable your floppy drive, if you have any
  2. (if you already have the SATA driver files, skip this)Install and run Virtual Floppy Drive and create an empty image on A:
  3. (if you already have the SATA driver files, skip this)Run the stupid SATA drivers you downloaded - the kind that only wants a diskette to format with its files and can't simply unzip them somewhere - and you will get the drivers in file form on the virtual diskette
  4. Start nLite and select the Drivers and the Bootable ISO options then Insert multiple drivers and give it the drive A: as a source
  5. Create the ISO image, then write it on a blank CD - writing CDs from images is different from just dragging and dropping files in the CD writer window, BTW

That should do it! Use the newly created CD as the XP installation disk without pressing F6 for loading SCSI drivers.

Clone Detective for Visual Studio 2008

I've stumbled upon a little VS2008 addon that I think could prove very useful. It's called Clone Detective. Here is how you use it:
  • Make sure VS2008 is closed
  • Download and install the setup file (additionally, the source is freely available!)
  • Open VS2008 and load a solution up
  • Go to View -> Other Windows -> Clone Explorer
  • Click the Run Clone Detective button

Now you should be able to see the percentage of cloned code in each file and also see the cloned code marked as vertical lines in the right-hand margin next to the code.

Karishma - a very nice Indian restaurant in Bucharest

Really flashy, isn't it? We intended to go to Thang Long, the Vietnamese restaurant, but it so happens that it was closed. On the same street there was this large, Indian-looking (elephants and all) restaurant: Karishma. It looked too flashy, which is why I had always suppressed my curiosity about going in, but that day I felt curious enough.

As it turns out, it is the real deal: good food at reasonably high prices, real Indians running the shop and even serving the food, nice people (at least with their customers), and a beautiful interior. Four people ate their fill for 215 lei, that is about 60 euros or 84 USD. We tried the lassi drinks (yoghurt with mango, or with cumin and salt), the masala tea (black tea, ginger and cardamom with lots of milk and sugar) and saucy meat dishes. We could choose the level of spiciness; I asked for very spicy and it was heaven. My table mates called it poison and stuck to their own meals.

All in all, I highly recommend it. Here is the address and contact information:

Karishma restaurant
Address: Iancu Capitan 36, Bucharest,
Phone: 0040-21-252.51.57

Sunday, September 14, 2008

Other pictures from Greece

In the blog posts about my trip to Greece I placed some of the pictures taken, but mostly ones pertaining to the place described in the surrounding text. Here are other pictures that have no special meaning other than that I like how they turned out.

black ants in Greece are slightly bigger than in Romania
Green orange tree
some park in Sparti
Sparti cat
Sparti dog

Let's Sparti!
entering Thessaloniki
street in Thessaloniki
old church in Thessaloniki

plants in Thessaloniki
plants in Thessaloniki
plants in Thessaloniki
plants in Thessaloniki
plants in Thessaloniki

Thessaloniki cat
Praying mantis in Kyparissi
plants in Kyparissi
plants in Kyparissi
plants in Kyparissi

Kyparissi plants
Kyparissi plants
Kyparissi plants
Kyparissi plants
Tzerry, the dog