
MikeK's software notebook

What you will find

This used to be the place where I wrote stuff I was thinking about while working on the Mozilla project.

Maybe in the near future I'll start to update it again as I'm involved in a couple of new open-source projects - updates pending...

Is it wrong to make a profit on the work of others?

I wonder Posted on 13 Jul, 2010 11:30:47

Ukrainian translation

The open web app store idea got some attention at the Mozilla Summit. I blogged about the idea of anyone being able to create sites to “sell” applications created by others, fortunately it seems like that is also what Pascal had in mind (correct me if I got it wrong).

I don’t think there is much opposition to letting the creators of web apps earn money on their creations. What could be discussed is whether or not the store should be able to take a piece of the cake, and whether the creator should have a say in what price the end user pays. I’ll get back to this in a moment, as that is what this blog post is mainly about.

First of all, I would like to note that, as there are hardly any borders on the Internet, I don’t believe that ideas about having different prices or release dates in different markets belong in the current world.

What I do believe is that the price of the same product can be different depending on which store you buy it in and I think it is fair that the store charges for its service.

If I create an application I should of course be able to decide its price, meaning I should be able to say how I want to be compensated for someone else using my creation. I can set the price, I can choose that I don’t want any compensation, that I want one dollar, or that I want a trillion billion dollars for it.

The big question is now: after I have sold the application to a third party (think the store), should I still be able to decide what price the third party charges if the product is resold? Personally, I would be pretty annoyed if I bought a TV set and the producer of that TV set could decide what price I could resell it for – say, Sony said that I can’t resell my Sony TV for any price less than what I paid for it originally – or if the previous owner of my house prevented me from selling my house for more than what I originally paid.

I think the same goes for software. If I create a store of web apps, I should be able to sell the applications for less than what I pay for them – I might want to draw traffic to my store by having discounts, or I might charge a premium for extra service, or to cover the cost of advertising that draws people to my store.

I fully understand that if I create an application and sell it for 99 cents, I might feel cheated if I see a store selling it for 10 dollars, but that is the world we live in, and competition between the stores will hopefully keep prices down when anyone is able to create a store and set the prices. If I go to a store and pay 10x the price that I later find for the same product in another store, then the store that charged 10x the price has lost me as a customer for the future.

My point is that if the stores can see a business case in selling apps, then we will have much more choice (one store can offer a money back guarantee and another one can have an offer of the week to draw customers in).

In the real world, price fixing and exclusive stores rarely benefit the consumer.

Anyone should be able to set up a store and sell whatever applications they like, as long as the original creator of the application gets the compensation that they want.

An open, open web app store

I wonder Posted on 08 Jul, 2010 02:46:03

I have just been to the talk about an open web app store – interesting, and yes, we need some way for the users and the developers to meet, and maybe, for a small fee, make their exchange.

Now the question in the session was whether Mozilla should in some way be involved in creating such a beast. Yes, I think we should help to promote the exchange of apps – but should it be in some kind of store? The problem with creating such a thing, in my eyes, is that it’s like saying there should only be one search engine, and that the creators of that engine should decide how the individual apps are judged.

Now you could say that anyone is free to create their own app store – but that raises the problem that my small and unknown store might not attract the developers – like having multiple search engines, each searching its own set of pages. Wouldn’t it be better to create some kind of protocol, so that I, as a developer, can host the app where I want to – it could even be on my own server – and the stores could pick the apps up from wherever they wanted?

Wouldn’t it be nice if, when I found a cool app, I could write about it in a blog post – and “sell” it from the blog post, so the people reading it could, with a single click, buy it from within my post? I could link to it from my Facebook wall, and my friends could buy it from within Facebook. How the technicalities of micropayments should be handled can be discussed, but it should be possible to integrate them into the server that is hosting the app, by letting it communicate with the server hosting the selling page, or the server dealing with the payment – many people know more than me about how to make this secure for both seller and buyer 🙂

Or maybe it is just a dream that this could be possible?

Building with scripts, scratchbox and LNG (last known good)

Mozilla coding hints Posted on 15 Jun, 2010 23:03:06

(Thanks to Alex Sallin, this post is also available in Czech here.)

This blog post is about how I build using scripts instead of calling make directly.

Last known good

When you do “random” pulls from mozilla-central you are rarely sure what the state of the tree is before you pull – it takes some time before tinderbox gets updated, and it might be burning red on a platform that you don’t care about, or it might not have a build for the special configuration you are working on…

To solve this problem I have two cron scripts running on my machine: one that pulls the latest and greatest from mozilla-central every half hour, and one that tries to build whatever was pulled. If the build succeeds with all the configurations that are important to me, it tags the revision so I can update to it when I need to – that means that at all times I can update to a known good revision. I do incremental builds, and while this does not always give the same result as a clean build, it is close enough for day-to-day use.
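The logic of the scheme above can be sketched roughly like this. The hg and build commands are stubbed out with shell functions so the control flow can be seen (and run) in isolation – the real scripts call hg and the build driver directly, and the function names here are invented:

```shell
#!/bin/sh
# Toy sketch of the pull/build/tag cycle; the stubs below stand in for
# the real hg and iBuild invocations described in the post.
pull_latest() { echo "hg pull && hg update -C"; }
build_all()   { echo "iBuild all"; true; }      # exit status = build result
tag_lng()     { echo "hg tag -f mikek-lng"; }

pull_latest
if build_all; then
    tag_lng                 # every configuration built: mark revision as LNG
else
    echo "build broken, keeping previous LNG tag"
fi
```

The key point is that the tag only moves when all configurations build, so updating to the tag is always safe.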

(The rest of this post is mostly relevant in a Linux environment)

Doing scratchbox builds without being in scratchbox

The trickiest part of creating the scripts was making the compile happen in scratchbox (when building for Maemo) – I was already doing:

$ sudo mount --bind ~/MozillaCode /scratchbox/users/mike/home/mike/MozillaCode

This lets me share a single copy of the source code between scratchbox and my normal development directory. (The above command will make ~/MozillaCode contain the same content inside and outside scratchbox for a user called “mike”. Remember to create the empty target directory before executing the command, and note that you can’t do it the other way around, mounting the scratchbox directory in your home directory.)

My first attempt at building in scratchbox without first logging into scratchbox involved calling a script running in scratchbox that did the building – but then I found that it is actually possible to use the same environment in scratchbox as you were in when launching scratchbox. This is done by using the -k flag, and you can even shift into a specific directory with the -d flag.

So to build, my script sets up the correct MOZCONFIG and then executes something like:

$ scratchbox -d "$sourceDir" -k make -f client.mk

where sourceDir points to the mozilla-central that I want to build.

Doing scratchbox builds without being in scratchbox from a cron script

Now, my Linux knowledge is too limited to know why – but in order to run the scratchbox command from a script run by cron, you first need to do:

export USER=mike

and then the scratchbox command – that is, if you happen to share my user name; otherwise substitute your own 🙂

Only getting the relevant output

When we build we get a lot of build output that is not very relevant in most cases. What is most important are the errors that can be hidden inside the output – and they are especially good at hiding if you are running multi-threaded builds.

The first thing I did was to add a:

mk_add_options MOZ_MAKE_FLAGS="--no-print-directory"

to my mozconfig files, as that takes away a lot of unneeded information – I’m sure it’s useful for some people to see where the build script is, but not for me.

The other thing I did was to redirect all the standard output to a file, leaving only the error output, by executing make like:

$ make -f client.mk > stdoutLogFile.txt

But as I wanted to be able to capture the error output, to see failures or successes from my cron script, I also needed to redirect the error output; that is done with the 2> redirection:

$ make -f client.mk > stdoutLogFile.txt 2> stderrLogFile.txt
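As a tiny self-contained illustration of the split (not Mozilla-specific – the fake “build” below just writes one line to each stream), only the stderr line ends up in the error log:

```shell
# Fake build step: one line to stdout, one to stderr, redirected
# exactly as in the make command above.
{ echo "compiling nsWindow.cpp"; echo "nsWindow.cpp:259: error: boom" >&2; } \
    > stdoutLogFile.txt 2> stderrLogFile.txt

cat stderrLogFile.txt   # → nsWindow.cpp:259: error: boom
```

So checking stderrLogFile.txt is enough to tell whether anything went wrong, without wading through the full build log.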

In my actual script the files are renamed so I can see which build generated them, and _OK is appended to the filename if the build was successful, and _Fail if there were build errors. This enables me to see at a glance which platforms are currently building and which are failing, and to get the build error I just need to open the file(s) ending in _Fail. Very easy and very convenient (I can even see which target is being built at the moment, as that gets a _Building suffix).
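A minimal sketch of that renaming step might look like this – the file names are illustrative, and `false` stands in for the real `make -f client.mk` invocation so that the failure branch is taken deterministically:

```shell
target=FF_PD
log="stderr_${target}.txt"
# `false` stands in for the real build command; its exit status decides
# which suffix the error log gets.
if false > "stdout_${target}.txt" 2> "$log"; then
    mv "$log" "${log%.txt}_OK.txt"
else
    mv "$log" "${log%.txt}_Fail.txt"
fi
ls stderr_${target}*   # → stderr_FF_PD_Fail.txt
```

A plain `ls` of the log directory then doubles as a status dashboard.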

At this very moment the Result* content of my LatestBuild directory is like:

[screenshot of the Result* log files, one per platform]

These correspond to the “stderrLogFile.txt” in the above make example, but for each platform I’m building for.

It can be seen that right now everything seems fine for the platforms I care about in my daily work.

What I can now do to get the latest code in my working directory is execute:

$ hg pull
$ hg update -r mikek-lng

(I always clone from my local repository rather than directly from mozilla-central when I start a new directory.) This way I can be almost sure that any failures after the update are due to errors I made in the patch I’m working on, and not something coming from mozilla-central – which was my main goal.

Building with a script instead of the command line

Previously I used to have several command line windows open, one configured for each target that I wanted to build (like Firefox Mobile PC, Firefox Mobile PC Qt version, Firefox Mobile Maemo Release version, …). It was confusing, and I was never sure which platforms I had kicked off builds on, or which window belonged to which platform.

As creating the LNG scripts had given me a basic and dangerous knowledge of writing shell scripts on my Ubuntu box, I got the idea of creating a single macro that could do all the building – and so iBuild was created. What I have now is a simple tool I can run that reports back to me, in a very simple way, whether the build is a success or not.

I can do the quick and dirty version, which takes forever:

$ ./iBuild all

This will simply build all the targets that I found relevant, as seen above. Or I can specify what I’m currently interested in, like:

$ ./iBuild FFM_NR FFM_NR_QT FF_PD
(FF_* = Firefox, FFM_* = Firefox Mobile, *_P* = PC, *_N* = Maemo (Nokia), *_?R = Release, *_?D = Debug, _QT = Qt version; otherwise it defaults to Gtk.) It’s all done with a big table and a number of mozconfigs, but I hope one day it will auto-generate the mozconfigs.
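In the spirit of that big table, the argument-to-mozconfig lookup might be sketched as a simple case statement. This is hypothetical – of the file names below, only NR_QT_NokiaRelease_mozconfig.txt is actually mentioned later in this post; the others are invented for illustration:

```shell
# Hypothetical target -> mozconfig lookup, roughly what iBuild's
# currentSyncTarget() table does (file names mostly invented).
mozconfig_for() {
    case "$1" in
        FF_PD)     echo "FF_PC_Debug_mozconfig.txt" ;;
        FFM_NR)    echo "FFM_NokiaRelease_mozconfig.txt" ;;
        FFM_NR_QT) echo "NR_QT_NokiaRelease_mozconfig.txt" ;;
        *)         echo "unknown target: $1" >&2; return 1 ;;
    esac
}

mozconfig_for FFM_NR_QT   # → NR_QT_NokiaRelease_mozconfig.txt
```

The build loop then just exports MOZCONFIG from the lookup result before invoking make for each requested target.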

A build cycle now looks something like (Note, scratchbox Maemo builds are handled inline):

Building from /home/mike/MozillaCode/100614
parameter is FFM_NR FFM_NR_QT FF_PD

Building FFM_NR
FireFox Mobile Nokia Release
Using Scratchbox for building
Build success

Building FFM_NR_QT
FireFox Mobile QT Nokia Release
Using Scratchbox for building
Build success

Building FF_PD
Firefox PC Debug
Build success

All builds were ok

The bottom line is the important one – it tells me at a single glance that everything went well, where before I needed to go to each individual command line window. If a build had failed, it is easy to see which one, and I have the error log saved in a file on the drive to see the exact error message.

My iBuild script is by no means finished: it contains very little error handling, it is constantly evolving, and while it works for me it might not work for anyone else. And while I didn’t intend for it to launch an ICBM attack when executed, it might – use at your own risk.

However, feel free to be inspired by it and create your own modified version – it can be found here, and it must be executed in the directory that contains your mozilla-central, i.e. the directory below where you would usually execute make from.

The currentSyncTarget() function contains the translation between command line arguments and mozconfig files, and the buildAll() function defines what the “all” command will build.

The script that is called from cron does a

hg pull
hg update -C

and then calls iBuild all to do the actual building; it’s protected against multiple executions in the same simplified way as used in the iBuild script, with a lock file.
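The lock-file guard can be as simple as this sketch (the path and echo are placeholders; the real script does the pull, build, and tag between taking and releasing the lock):

```shell
# Skip this run if a previous one is still going; otherwise take the
# lock, do the work, and release it.
LOCKFILE=/tmp/lng-build.lock
if [ -e "$LOCKFILE" ]; then
    echo "build already running, skipping this run"
else
    touch "$LOCKFILE"
    echo "pull, build and tag would happen here"
    rm -f "$LOCKFILE"
fi
```

A trap on EXIT would make the release more robust if the work between touch and rm can die mid-way, but the simple form above matches the simplified approach described in the post.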

If iBuild is successful, it then uses hg to tag the version:

hg tag -f mikek-lng

Any feedback will, as usual, be appreciated.

mozilla-central/widget/src/qt/nsWindow.cpp:259: error: ‘OpenGL2’ is not a member of ‘QPaintEngine’

Why did I reboot? Posted on 28 May, 2010 16:03:24

This was one of the easier errors to fix – that is, _after_ I realized what the problem was, because the error message didn’t help at all…

So if you get something like:

mozilla-central/widget/src/qt/nsWindow.cpp:259: error: ‘OpenGL2’ is not a member of ‘QPaintEngine’

when building in Scratchbox (Fremantle), the problem is that your Qt dev files are out of date. To fix it, log into scratchbox and do:

> apt-get update
> fakeroot apt-get upgrade

After this you will need to reinstall some of the Qt packages:

> apt-get install libqt4-gui

I had to delete my build directory (the one MOZ_OBJDIR points to in your mozconfig) afterwards too.


This post is also available in Belarusian, translated by Patricia Clausnitzer, posted on Fatcow.

configure: error: Couldn't find curl/curl.h which is required for the crash reporter. Use --disable-crashreporter to disable the crash reporter.

Why did I reboot? Posted on 29 Mar, 2010 10:08:37

Of course you can disable the crash reporter if you get this error when building Firefox/Fennec/whatever you are building. But you can also choose to keep building it by installing the missing files.

On an Ubuntu system you can install the curl development files by doing:

$ sudo apt-get install libcurl4-openssl-dev

This will use OpenSSL for SSL support; if you prefer GnuTLS, then install libcurl4-gnutls-dev instead.

configure: error: Can’t find header GL/glx.h for WebGL (install mesa-common-dev (Ubuntu), mesa-libGL-devel (Fedora), or Mesa (SuSE))

Why did I reboot? Posted on 10 Feb, 2010 15:26:54

If when building in Scratchbox you get an error like:

configure: error: Can’t find header GL/glx.h for WebGL (install mesa-common-dev (Ubuntu), mesa-libGL-devel (Fedora), or Mesa (SuSE))

The solution is to update your mozconfig with the following line:

ac_add_options --with-maemo-version=5

For a complete mozconfig, check:

How to setup the build environment to build Fennec for Qt with the MAEMO 5 SDK (Fremantle) in scratchbox on Ubuntu

Why did I reboot? Posted on 02 Feb, 2010 14:35:04

Note: the following instructions should also work on top of previous installations of scratchbox; you don’t need to uninstall the previous version, but you should be logged out of it.

Install the Nokia development kit for the N900 (MAEMO 5 SDK). I chose:

Run it by:
$ sudo python ./

(If you don’t have Python, you need to install it first; then follow the on-screen instructions.)

This will launch the GUI installer; choose to install the development version. This will install scratchbox and most of the libraries that you will need.

Launch scratchbox:
$ scratchbox

Now you can choose the platform you want to work on with the scratchbox tools:

> sb-menu

Choose “FREMANTLE ARMEL” to run code natively on the N900.

You now need to install a few libraries from inside scratchbox:

> apt-get install python2.5 libqt4-core libqt4-gui libqt4-dev libidl-dev

There are two important things you will need to remember in your mozconfig:

1) tell it to use Qt:
ac_add_options --enable-default-toolkit=cairo-qt

2) Currently we also need to disable the crash reporter, as it hasn’t been converted to Qt:
ac_add_options --disable-crashreporter

For reference my MOZCONFIG is NR_QT_NokiaRelease_mozconfig.txt:

# cs2007q3 gcc 4.2 is busted, we think, and doesn't
# look in the expected places. --dougt.
# $PWD/… was added due to bug 463076
export LDFLAGS="-Wl,-rpath-link,$PWD/dist/bin/:/usr/lib:/lib"

# Options for
mk_add_options MOZ_BUILD_PROJECTS="xulrunner mobile"
mk_add_options MOZ_OBJDIR=@TOPSRCDIR@/../objdir-fennecNokia-qt-release
mk_add_options MOZ_MAKE_FLAGS="-j3 --no-print-directory"
ac_add_options --enable-gstreamer

# Global options
# ac_add_options --enable-debug
# ac_add_options --disable-optimize

# XULRunner options
ac_add_app_options xulrunner --enable-application=xulrunner
ac_add_app_options xulrunner --disable-javaxpcom

# Enabling --with-arm-kuser implies Linux on ARM and enables kernel
# optimizations for that platform
ac_add_app_options xulrunner --with-arm-kuser

# Disabling tests due to bug 454881
ac_add_options --disable-tests

# mobile options
ac_add_app_options mobile --enable-application=mobile
ac_add_app_options mobile --with-libxul-sdk=../xulrunner/dist

# enable qt
ac_add_options --enable-default-toolkit=cairo-qt
ac_add_options --disable-crashreporter

Scratchbox: Temporary failure resolving ‘’

Why did I reboot? Posted on 12 Nov, 2009 21:26:06

So there I was, happily writing documentation for the GStreamer integration, when I came to think of something that could be neat (read: simplify the implementation) – I just needed to do a little test first, which wouldn’t take more than a build and a download… yeah, right 🙂

My test was running fine on the PC, and I wanted to re-run it on the N810 – if it works there and on the PC, it will probably run everywhere. I just needed to install an extra package in scratchbox in order to do the test. I verified with “apt-cache search” that the specific package was known (it’s sometimes a problem running under scratchbox that the packages are different from Ubuntu’s, or missing altogether). Anyway, I fired up “apt-get install …” and it failed:

Temporary failure resolving ‘’

hmmm… so I tried:

[sbox-CHINOOK-ARMEL-2007: ~] > apt-get update
Ign file: chinook Release.gpg
Ign file: chinook Release
Err chinook Release.gpg
Temporary failure resolving ‘’
Err chinook Release.gpg
Temporary failure resolving ‘’
Failed to fetch Temporary failure resolving ‘’
Failed to fetch Temporary failure resolving ‘’
Reading package lists… Done
E: Some index files failed to download, they have been ignored, or old ones used instead.
[sbox-CHINOOK-ARMEL-2007: ~] >

A quick ping to #mobile didn’t generate any response. Searching the web first gave some hits several years old – but really no answers, except “ohh… it says it’s a temporary error, try again later” – yeah, right. There was one guy who had the trouble for two weeks – I didn’t think I would want to wait that long and still not have a solution.

So, thinking back on what I had done in the last few weeks – reformatting my hard drive to ext4 and installing Ubuntu 9.10 – hmm… it couldn’t be that, as I had fetched and installed several packages since then to be able to build Fennec… not a strong indicator that it had anything to do with that upgrade…

I could download the packages fine outside of scratchbox, so it wasn’t a server problem either. I tried installing the “traceroute” package manually (by downloading it, copying it into scratchbox and running “dpkg -i traceroute_1.4a12-21_armel.deb”) – the install worked fine, but I couldn’t traceroute anything… hmm… network issue? ssh worked fine for remote machines – so the network seemed to be fine, also from within scratchbox…

Enough babbling – the problem was the “/scratchbox/etc/resolv.conf” file, which apparently holds the address of the nameserver. It pointed to my old router, and not the new one I installed last week – after fixing the IP address in there to point to the new router, everything worked nicely again. Why I haven’t had that problem before is beyond me…
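In sketch form the fix amounts to rewriting that one file. The path here defaults to a scratch location so the sketch is safe to try as-is; the real file is /scratchbox/etc/resolv.conf (which needs root to write), and the nameserver address is an example – use your own router’s:

```shell
# Point the scratchbox resolver at the current nameserver.
RESOLV="${RESOLV:-/tmp/resolv.conf}"      # really /scratchbox/etc/resolv.conf
NAMESERVER="${NAMESERVER:-192.168.1.1}"   # example router address
echo "nameserver $NAMESERVER" > "$RESOLV"
cat "$RESOLV"   # → nameserver 192.168.1.1
```

After that, name resolution inside scratchbox (and hence apt-get) should work again without re-entering the sandbox.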
