Loud ramblings of a Software Artisan

Friday 1 February 2019

Music, Flathub and Qt

I have started reading recently about music theory and such, with the aim of learning music (again). This led me to look at music software, and at what we have on Linux.

I found a tutorial by Ted Felix on Linux and MIDI.

I quickly realised that trying these apps on my Dell XPS 13 was a real adventure, mostly because of HiDPI (the high DPI screen that the XPS 13 has). Many of the applications found in Fedora don't support high DPI by default and are thus quasi impossible to use out of the box. Some of it is easily fixable, some needs a bit more effort, and for some we need to try harder.

Almost all the apps I have tried use Qt. With Qt5 the fix is easy, albeit not necessarily user friendly: just set the QT_AUTO_SCREEN_SCALE_FACTOR environment variable to 1, as specified in the Qt HiDPI support documentation. There is also an API to set the attribute on the QCoreApplication object. There must be a good reason why this is opt-in and not opt-out.
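For a quick test before any packaging fix, the variable can simply be set when launching the application from a terminal (the application name here is just an example):

```shell
$ QT_AUTO_SCREEN_SCALE_FACTOR=1 rosegarden
```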

This is the case for Muse3 (aka the MusE sequencer, not to be confused with MuseScore, which works out of the box, at least from Flathub) and Rosegarden.

LMMS and Hydrogen, on the other hand, use Qt4 on Fedora 29. The good news? Both have been ported to Qt5, so it is just a matter of building these newer versions. After doing so, they still need the workaround described above.

This is where Flathub comes into play: make them available on Flathub, where we can set that environment variable for the runtime.

In the end, I have Hydrogen available on Flathub, the three others in queue for Flathub, and all have had patches submitted (with Muse3 and Rosegarden already merged upstream).

Now, what other cool Free Software music-oriented applications exist?

Monday 10 December 2018

Microsoft vs the web

I have been saying for a few years now that Chrome is the new IE, and that Google is the new Microsoft (Microsoft being the new IBM). This statement has been somewhat tongue in cheek, but I have always been serious about it not being a joke: history is repeating itself. I could go on at length about all the reasons why I believe this to be true, but I’ll just talk about one new development.

Last week, Microsoft announced that they had decided to abandon EdgeHTML, their web browser engine, and move to using Google’s Chromium as the heart of their web browser offering, Edge. [1] Whether it will be just Blink and V8 (the web rendering and JS engines, respectively) or also other parts of Chromium is still unclear.

The takeaway from their statement is:

  • Because web developers seem to only care about Chrome, Microsoft believes in the short-term gain of using Chromium for web compatibility, since EdgeHTML is lagging behind. They view web compatibility as a single web runtime, not as better and more diverse standards implementations:

“1. We will move to a Chromium-compatible web platform for Microsoft Edge on the desktop. Our intent is to align the Microsoft Edge web platform simultaneously (a) with web standards and (b) with other Chromium-based browsers. This will deliver improved compatibility for everyone and create a simpler test-matrix for web developers.” [2]

  • They view their own EdgeHTML code base as unportable, even on their own operating system, too tightly bundled:

“We will evolve the Microsoft Edge app architecture, enabling distribution to all supported versions of Windows including Windows 7 and Windows 8, as well as Windows 10. We will also bring Microsoft Edge to other desktop platforms, such as macOS” [3]

  • EdgeHTML is too tightly coupled to Windows, baked into Windows 10 updates. This is the self-inflicted wound that prevents improving just that component, or even fixing security issues, without updating the whole OS:

“If every Edge user were using the very latest version of Edge, it wouldn't be so bad, but that's not the case, and that's because of how Microsoft has bundled Edge with Windows 10. Most home users will end up running the latest feature update to Windows 10 within a few months of its release. But enterprise users are more diverse. This means that Edge, already a relatively small target for Web developers to think about, suffers major version fragmentation. Contrast this with Chrome, where within a few days of a new version coming out, almost the entire user base is migrated.” [4]

  • This will bring somewhat better parity with Edge on mobile platforms, as the Android version of Edge is based on Chromium. (iOS remains the exception, but I’ll leave that for another day.)

Microsoft recognized that they failed at reconquering the web.

One thing that is clear is that Microsoft will contribute (or try to) to Chromium, Blink and V8 to make these better for them. Remember that Blink is a fork of WebKit, born because Google couldn’t work with WebKit’s major sponsor, Apple, so this may not be a done deal.

The other clear thing is that the little market share Edge took away from Chrome as an alternative implementation will be aggregated into Chrome’s. Microsoft is actually helping the hegemony of Google, their competitor in several other markets (Bing, Hotmail, Azure), in controlling the web browser space, losing any leverage over web standards.

I wish they had gone the Mozilla route. Not as easy as the one they chose, but still probably way easier than their current situation, and helping Mozilla is helping the web stay relevant as an open standard.

Mozilla’s mission has become more important than ever, and if you want to do something useful for the future of the web, just use Firefox, and ensure, if you are a developer, that everything runs smoothly with it.

Ferdy Christant’s piece on the state of the web browsers is a relevant read on the whole situation. So is part 2.

Thursday 9 August 2018

AbiWord: It's alive

No, AbiWord is not dead. The project has been moving very slowly due to very few contributions. But sometimes a spark ignites things.

Martin needed to fix a major issue we had, related to how AbiWord uses Gtk wrong.

Then I took some time to perform two long overdue tasks:

1. Moving to git. We have had a complete mirror of the Subversion repository on GitHub for a few years. But now that GNOME has moved to their own GitLab, I felt that this was a more appropriate place for AbiWord. So we did. Thanks to Carlos Soriano, AbiWord is now hosted in GNOME World.

2. Making AbiWord available as a Flatpak. This alleviates all the issues plaguing us where distributions never update the app, even on the stable branch. For a while I had a manifest to build it; all it needed were a few adjustments and reaching the hosting stage. Thanks to the Flathub team, AbiWord 3.0.2+ is now available on Flathub. Check it out.

The next steps:

1. Release 3.0.3 (the current stable branch) and update the Flatpak.

2. Release 3.next (I think it will be 3.2) as the next stable branch. This one includes a large set of fixes that should fix a lot of crashes and memory corruption.

3. Work on more incremental releases. With Flatpak this will be easier.

4. Gtk4 port: we might hit some roadblocks with that.

Thursday 3 May 2018

libopenraw 0.1.3

I just released libopenraw 0.1.3. It has been a year since the last release. It mostly adds new cameras.

I already have a branch for a new feature that will introduce Rust, for the next release.

Friday 20 October 2017

Lightroom exporter/importer

Importing from legacy proprietary software is always important to liberate one's data and help migrate to libre solutions. To that effect, I wrote, in Rust, lrcat-extractor: a library (a crate, in Rust parlance) with a command-line tool to extract and dump Lightroom catalogs, the central part of the user-generated data.

Currently only Lightroom 4 catalogs (two versions behind) are supported, as it is the only version I have. Supporting Lightroom 6 is just a matter of me purchasing it and spending some time.

The goal is of course to use this crate in Niepce to import Lightroom 4 catalogs directly.

Head to the lrcat-extractor repository if you are curious.

As a side note, I also have the equivalent for the now discontinued Apple Aperture, which I started to write almost two years ago as a practical exercise to learn Rust, and which I used as a base. Check out aplib-extractor.

The next step is to implement the actual import, probably filling the gaps in the API.

Update: and before someone says it, the timing of this project is quite coincidental.

Thursday 19 October 2017

Porting to Rust

I started something crazy a few weeks ago: porting my app from C++ to Rust.

I'll talk about it here, in several parts.

This is Part 1.


Once upon a time I started to write a Free Software photography application for GNOME. That was around 10 years ago.

At the time I chose to use C++ and Gtkmm 2.x. C++ because it provides unprecedented interoperability with existing C code, while offering a stronger type system, RAII and templates, which allow safer programming patterns without a large runtime overhead. Absolutely no regrets on that.

Over the years, despite never having been released, the code base was updated to C++11 and Gtkmm 3.x.

Meanwhile, Rust was developed at Mozilla. Rust provides safety and high performance, but doesn't have the same ease of interoperability with a C codebase, albeit it still makes it reasonably easy.


Why am I porting to Rust?

  • Rust is designed to be a safe language. In this day and age of 0days caused by the use of C (and C++) and their unsafe practices, this is important.
  • I want the application to be multi-threaded (it already is by design), which is one of the design goals of Rust.
  • I have started writing other standalone components in Rust.

The goal is to progressively rewrite the application in Rust, mostly translating the logic from C++ to Rust. It is not the first time I have done that across languages. I'll go from the back to the front, the front being the UI written with Gtkmm.

Several steps.

Build system

There have already been a few posts about integrating Rust into an automake based build system, by Federico or by me. The gist of it is to make sure you build your library with Rust and link it to your application.

Building Rust code is trivial using cargo. There is no point using anything else.


When doing an ad hoc conversion, the biggest problem is calling code written in one language from the other.

You cannot call Rust methods or generic functions from C/C++, so you'll need to have some interface in place. Rust makes this relatively easy by supporting the C calling convention and making it easy to declare a function with name mangling disabled:


#[no_mangle]
pub extern "C" fn my_function(arg1: &Type1) -> RetType {
    // ...
}


From C/C++ this would be something like:

extern "C" RetType my_function(const Type1* arg1);

Here we assume that RetType is a POD (Plain Old Datatype, like int, char, float), not a C++ class, nor a struct in Rust.

There is a tool called cbindgen that will generate C/C++ headers from the Rust code. You can call it from the Rust build system to do that automatically. More on that later.

Calling C code from Rust is easy provided you have the proper interfaces. It is basically the reverse of the above.

To call C++ methods, on the other hand, you have to use bindgen.


Bindgen will generate, from C/C++ headers, Rust modules to call the C++ code.

The tool is pretty amazing, not to say magical. There are some pitfalls, though.

For example, a C++ class with a std::vector<> as a member will be a problem if treated as an opaque type. To be honest, I am not sure it would work as a non-opaque type either.

The use of boost might actually generate Rust code that doesn't compile.

Running the Rust tests with cargo test will fail. I think it is Issue 1018.

There are a few things I recommend. Most notably: treat types as opaque whenever possible; whitelist types and functions rather than blacklist them. Your mileage may vary, but this is what I got to work in my case.
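As a sketch, a build.rs applying those recommendations could look roughly like this (the header path and the engine:: names are hypothetical, and this assumes the bindgen crate as a build-dependency; check the bindgen documentation for the exact builder methods of your version):

```rust
// build.rs sketch -- hypothetical names, not the actual Niepce build script.
fn main() {
    let bindings = bindgen::Builder::default()
        .header("src/engine.hpp")           // hypothetical C++ header
        .clang_arg("-x")
        .clang_arg("c++")
        .opaque_type("std::.*")             // treat std:: types as opaque blobs
        .whitelist_type("engine::.*")       // whitelist, don't blacklist
        .whitelist_function("engine::.*")
        .generate()
        .expect("unable to generate bindings");

    bindings
        .write_to_file("src/bindings.rs")
        .expect("unable to write bindings");
}
```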

Using these tools, I managed to actually rewrite large parts of the backend in Rust, with equal functionality. Progressively, more C++ code will be replaced.

I'll get into detail in a future post.

Wednesday 9 August 2017

Status update, August 2017


In March I joined the team at eyeo GmbH, the company behind Adblock Plus, as a core developer. Among other things I'm improving the filtering capabilities.

While they are based in Cologne, Germany, I'm still working remotely from Montréal.

It is great to help make the web more user-centric.

Personal project:

I started working again on Niepce, currently implementing the file import. I also started to rewrite the back-end in Rust. The long-term goal is to move completely to Rust; this will happen in parallel with feature implementation.

This and other satellite projects are part of the great plan I have for digital photography on Linux with GNOME.

'til next time.

Friday 10 March 2017

So, I got a Dell

Long. Overdue. Upgrade.

I bought a Dell XPS 13 as my new portable workstation for Linux and GNOME. This is the model 9360 that is currently available as a Developer Edition with Ubuntu 16.04 LTS (project Sputnik, for those who follow). It satisfies all I was looking for in a laptop: lightweight, small 13", 16 GB of RAM (at least), a core i7 CPU (this is a Kaby Lake), and it must run Linux well.

But I didn't buy the Developer Edition. Price-wise the Developer Edition is CAD$150 less than the Windows version in the "business" line and CAD$50 less than the Windows version in the "home" line (which only has that version). Exact same hardware. Last week it was on sale, CAD$500 off the regular price, so it didn't matter. I got one. I had delayed getting one for so long that this was the opportunity for a bargain. I double checked, and unlike the previous Skylake-based model that didn't have the same wifi card, this one is really the same thing.

I got a surprise door bell ring, from the delivery person (the website didn't tell me it was en route).


The black box

It came in a box containing a cardboard wrap and a nice black box. The cardboard wrap contained the power brick and the AC cord. I'm genuinely surprised by the power adapter: it is small, smaller than what I'm used to. It is just odd that it doesn't come in the same box as the laptop (not a separate shipping box, just boxes in a box, shipped to you at once). The nice black box only bears the Dell logo and contains the laptop and two small booklets. One is for the warranty, the other is a getting started guide. Interestingly, it mentions Ubuntu as well, which leads me to think that it is the same for both preloads. This doesn't really matter in the end, but it just shows the level of refinement for a high-end laptop, which until the last Apple refresh was still more expensive than the Apple equivalent.

New laptop...

Fiddling with the BIOS

It took me more time to download the Fedora live ISO and flash it to the USB stick than the actual setup of the machine took, minus some fiddling. Once I had booted, Fedora 25 was installed in 20 minutes. I did wipe the internal SSD, BTW.

Once I figured out that it was F2 I had to press to get into the BIOS at boot, to set it to boot off the USB drive, I also had to configure the disk controller to AHCI instead of RAID, as without that the Fedora installer didn't find the internal storage. Note that this might be more troublesome if you want to dual boot, but I didn't want that. Also, I don't know what the default setup is in the Developer Edition either.

Dell XPS13 BIOS configuration

Nonetheless, the Fedora installer worked well with mostly the defaults. HiDPI is handled, which means that I finally have a laptop with a "Retina Display".

Hands on

The laptop is small, with a HiDPI screen, a decent trackpad that works out of the box with two-finger scrolling, an aluminium shell with a non-glowing logo in the middle, a black inside with the keyboard, and a glass touch screen. The backlit keyboard has a row of function keys at the top, like it should be, a row that doubles as "media" buttons with the Fn key, or actually without it. Much like on a MacBook. The only difference I will have to get used to is that the Control key is actually in the corner. Like it used to be... (not sure if that's on all Dells though, but I remember hating not having Control in the corner, then getting used to it like there was no option, and that was more than a decade ago). That will make for some finger reeducation, that's for sure. The whole laptop is a good size; it is very compact, light but solid.

As for connectivity, there is an SD card reader, 2 USB 3.0 ports (Type A) and a USB 3.1 Type-C port that carries HDMI and Thunderbolt. For HDMI it looks like a CAD$30 adapter is needed, but it seems to be standard, so it might not be a huge problem.


I am happy with it. GNOME is beautiful in HiDPI, and it is all smooth.

Saturday 4 March 2017

Link: The Story of Firefox OS

Just a plain link to The Story of Firefox OS by Ben Francis. Ben retells the story so much better than I will ever be able to.

I’m incredibly proud of what we achieved with Firefox OS. If you measure its success against the original stated goals I think the project went way beyond its initial objectives. We went beyond a technical prototype to successfully ship 20 different commercial devices in over 30 countries, including smartphones and smart TVs. Apart from the incredible feats of engineering, prototyping 30 new APIs for the web platform and building possibly the most complex JavaScript codebase ever created, we built a mobile operating system from scratch and got it to market in less than two years. As far as I know no other team has ever done this.

I worked on Firefox OS for several years as well (I arrived on the project some time after Ben did), mostly until its end. I noticed there was a lot of misunderstanding about what the goal was, and a lot of questions. To me it was a great effort that tried to disrupt the market by opening the silos of mobile applications, using the web, trying to become the third mobile OS. A project a lot of people didn't think fit in Mozilla's mission. Its conclusion brought me much personal sadness.

Also remember, unlike Android, Firefox OS was developed in the true open source way: in the open, with community participation, and not behind closed doors with a code drop at each release, stripped down of features.

Saturday 17 December 2016

So, I'm looking for a job

I decided a while ago to avoid personal posts here with my loud ramblings, because they end up being pointless. I'll make an exception now and bring in a little life issue that is a bit sudden.

I'm looking for a job, in snowy Montréal (I can do remote too).

If you are looking for an experienced developer to join your team, I can be that person. Even for a limited time, I'm ready to start as soon as you are.

My résumé

And may you have happy holidays.

Tuesday 13 December 2016

Rust and Automake - Part 2

I already talked about Rust and Automake. This is a follow-up.

One of the caveats of the previous implementation is that it doesn't pass make distcheck, as it doesn't work when srcdir != builddir.

The main problem is that we need to tell cargo to build in a different directory. Fortunately Cargo PR #1657 has taken care of this. So all we need is a carefully crafted CARGO_TARGET_DIR environment variable to set before calling cargo. You can substitute abs_top_builddir for the top build directory.

But let's show the patch. Nothing complicated.
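A minimal sketch of the idea, in the Makefile.am (assuming configure substitutes abs_top_builddir, as it normally does):

```make
# Build under $(abs_top_builddir)/target instead of the source tree,
# so that srcdir != builddir (and make distcheck) works.
all-local:
	CARGO_TARGET_DIR=@abs_top_builddir@/target cargo build --release
```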

Sunday 27 November 2016

libopenraw 0.1.0

I just released libopenraw 0.1.0. It is to be treated as a snapshot, as it hasn't reached the level of functionality I was hoping for, and it has been 5 years since the last release.

Head on to the download page to get a tarball.

There are several new APIs, and some API and ABI breakage. The .pc files are now parallel-installable.

Sunday 6 November 2016

Introducing GPS Ami

Once upon a time, I started geotagging my photos. For that I bought a GPS logger, a Holux M-1200E. The device works great with gpsbabel, and since my photography workflow was stuck on MacOS, I used Houdah GPS (which uses gpsbabel behind the scenes, BTW). I have also been working for too long on moving that workflow to Linux and GNOME. At one point I even started to write an app I called "Magellan" to do what that MacOS tool did, as a part of my other project, Niepce. I didn't really get motivated, so it went nowhere. It was written in C++ like the rest of Niepce; the technology isn't the problem here.

Fast forward: my photography machine got upgraded to a more recent version of its operating system after the one I was using was abandoned by browser vendors, and it happens that with that upgrade the GPS logger stopped working, because MacOS 10.11 stopped providing the USB-to-serial driver needed. I could install some random driver, but given how much trust I have in those, I decided to pass. On Linux, it still works.

I had already started rewriting Magellan in Rust using gtk-rs; I did that as just another Rust learning project. This breakage came right in time as a good motivator to actually push the development of that application and make it work. And it does.

The name "Magellan" was already used for some GPS related product (not surprising), so my app became GPS Ami ("Ami" means "friend" in French).

The design

I basically reimplemented Houdah GPS, UI and all. It works OK, but I think it will act as a transitional state. I have bigger plans.

Notably, I want to allow better control of the device, like what bt747 can do (my logger is based on that chipset), and other automation features, so I can use GPS Ami from Niepce to download the tracks. I have currently only tested with the device I have.

The implementation

As explained before, GPS Ami is written in Rust and uses gtk-rs for the UI. I have to be honest, gtk-rs is not ready for prime time, but it looks very promising and I'm very happy to be able to contribute as needed. Not surprisingly, you should be ready to put your hands in it if you want to use it. I did just that: provided more APIs, filed some bugs, sometimes fixing them. I also had to implement gudev-rs to be able to have gudev functionality for device hotplug, to plug into the main loop. This was a learning experience.

Rust tooling is a lot about generating a monolithic binary, without data files. This is not bad per se, but when you need data files, like the .ui files from Glade to load the UI in Gtk (albeit not required on the Gtk side, it is more convenient), you are a bit stuck. Fortunately there is the include_str!() macro in Rust, which means "load this file at compile time and put it in this string".

Another problem I had was installing the rest of the files, like the .desktop file or icons, a problem I solved by wrapping cargo into an automake build system.

Overall, I'm just calling gpsbabel from a UI to download files.

The future

So what should come into the future?

  • polish the UI more, possibly tweaking it. This will probably require more gtk-rs work. IMHO there are a few show stoppers for a first version.
  • implement a driver for this GPS, in Rust, so that the actual driver is written in a memory-safe language.
  • see about adding more control features: we should be able to read the GPS values like it is done in bt747.
  • support other devices officially (I don't have any others though)

Help wanted

  • I don't have an icon
  • I only have one device to test with

Friday 7 October 2016

Rust and Automake

But why automake? Cargo is nice.

Yes it is. But it is also limited to building the Rust crate. It does one thing, very well, and easily.

But I'm writing a GNOME application, and this needs more than building the code. So I decided I needed to wrap the build process into automake.

Let's start with Autoconf for Rust Project. This post is a great introduction to solving the problem and gives an actual example of doing it, even though the author only uses autoconf. I need automake too, but this is a good start.

We'll basically write a configure.ac and a Makefile.am in the top level Rust crate directory.


AC_INIT([gpsami], m4_esyscmd([grep ^version Cargo.toml | awk '{print $3}' | tr -d '"' | tr -d "\n"]), [hub@figuiere.net])
AM_INIT_AUTOMAKE([1.11 foreign no-dependencies no-dist-gzip dist-xz subdir-objects])

Let's init autoconf and automake. We use the options: foreign to not require all the GNU files, no-dependencies because we don't have dependency tracking done by make (cargo does that for us) and subdir-objects because we have one Makefile.am and don't want recursive mode.

The m4_esyscmd macro runs a shell command to extract the version out of the Cargo.toml.

VERSION=$(grep ^version Cargo.toml | awk '{print $3}' | tr -d '"' | tr -d "\n")

This does the same as above, but puts it into VERSION.

This shell command was adapted from Autoconf for Rust Project but fixed as it was being greedy and also captured the "version" strings from the dependencies.

AC_CHECK_PROG(CARGO, [cargo], [yes], [no])
AS_IF([test x$CARGO = xno], [
    AC_MSG_ERROR([cargo is required])
])
AC_CHECK_PROG(RUSTC, [rustc], [yes], [no])
AS_IF([test x$RUSTC = xno], [
    AC_MSG_ERROR([rustc is required])
])

Check for cargo and rustc. I'm pretty sure without rustc you don't have cargo, but better be safe than sorry. Note that this is considered a fatal error at configure time.

dnl Release build we do.
CARGO_TARGET_DIR=release
AC_SUBST(CARGO_TARGET_DIR)

This is a trick: we need to know the cargo target directory. We hardcode it to release, as that's what we want to build.

The end is pretty much standard.

So far just a few tricks.


desktop_files = data/gpsami.desktop

desktopdir   = $(datadir)/applications
desktop_DATA = $(desktop_files)
ui_files = src/mgwindow.ui

Just some basic declarations in the Makefile.am: the desktop file, with its installation target, and the ui_files. Note that at the moment the ui files are not installed, because we inline them in Rust.

EXTRA_DIST = Cargo.toml \
	src/devices.json \
	src/devices.rs \
	src/drivers.rs \
	src/gpsbabel.rs \
	src/main.rs \
	src/mgapplication.rs \
	src/utils.rs \
	$(ui_files) \
	$(desktop_in_files)

We want to distribute the source files and the desktop files. This will get more complex as the crate grows, as we'll need to add more files here.

all-local:
	cargo build --release

clean-local:
	-cargo clean

Drive build and clean targets with cargo.

install-exec-local:
	$(MKDIR_P) $(DESTDIR)$(bindir)
	$(INSTALL) -c -m 755 target/@CARGO_TARGET_DIR@/gpsami $(DESTDIR)$(bindir)

We have to install the binary by hand. That's one of the drawbacks of cargo.

With this in place, we do:

$ autoreconf -si
$ ./configure
$ make
# make install

This builds in release mode and installs into the prefix. You can even run make dist, which is another of the reasons why I wanted to do this.

Caveats: I know this will not work if we build in a different directory than the source directory. make distcheck fails for that reason.

I'm sure there are ways to improve this, and I probably will, but I wanted to give a recipe for something I wanted to do.

Wednesday 28 September 2016

Introducing gudev-rs

A couple of weeks ago, I released gudev-rs, Rust wrappers for gudev. The goal was to be able to receive events from udev into a Gtk application written in Rust. I had a need for it, so I made it and shared it.

It is mostly auto-generated using gir-rs from the gtk-rs project. The license is MIT.

Source code

If you have problems, suggestions, or patches, please feel free to submit them.