Loud ramblings of a Software Artisan

Saturday 11 July 2020

Introducing Minuit

It all started with a sketch on paper (beware of my poor penmanship).

Sketch of a UI

I was thinking about how I could build a musical instrument in software. Nothing new here, actually; there are already applications like VMPK that fit the bill. But this is also about learning, as I have started to be interested in music software.

This is how Minuit was born. The name comes from a play on words: in French, minuit is midnight while midi is midday. MIDI, of course, is the acronym for Musical Instrument Digital Interface, the technology at the heart of computer-aided music. Also, minuit sounds a bit like minuet, which is a dance of social origin.

I have several goals here:

  1. learn about MIDI: The application can be controlled using MIDI.
  2. learn about audio: Of course you have to output audio, so now is the best time to learn about it.
  3. learn about music synthesis: for now I use existing software to do so. The first instrument was ripped off Qwertone, to produce a simple tone, and the second is just a Rhodes toy piano using soundfonts.
  4. learn about audio plugins: the best way to bring new instruments is to use existing plugins, and there is a large selection of them that are libre.

Of course, I will use Rust for that, and that means I will have to deal with the gaps found in the Rust ecosystem, notably by interfacing with libraries that are meant to be used from C or C++.

I also had to create a custom widget for the piano input, and for this I basically rewrote in Rust a widget I found in libgtkmusic, which is written in Vala. That allowed me to refine my tutorial on subclassing Gtk widgets in Rust.

To add to the learning, I decided to do everything in Builder instead of my usual Emacs + Terminal combo, and I have to say it is awesome! It is good to start using tools you are not used to (ch, ch, changes!).

In the end, the first working version looked like this:

Basic UI for Minuit showing the piano widget and instrument selection

But I didn't stop there. After some weeks off doing other things, I resumed. I was wondering, since I already had a soundfont player, whether I could turn this into a generic soundfont player. So I got to this:

Phase 1 of UI change for fluidlite

Then I iterated on the idea, learning about banks and presets for soundfonts, which come straight from MIDI, and made the UI look like this:

Phase 2 of the UI change for fluidlite

Non configurable instruments have a placeholder:

UI for generic instrument placeholder

There is no release yet, but it is getting close to the MVP. I am still missing an icon.

My focus forward will be:

  1. focus on user experience. Make this a versatile musical instrument, for which the technology is a means, not an end.
  2. focus on versatility by bringing more tones. I'd like to avoid free-form plugins and rather integrate plugins into the app, while leveraging that plugin architecture whenever possible.
  3. explore other ideas in musical creativity.

The repository is hosted on GNOME gitlab.

Monday 22 June 2020

Flatpak extensions

Linux Audio

When I started packaging music applications in Flatpak, I was confronted with the issue of audio plugins: a lot of music software has effects, software instruments and more implemented as plugins, colloquially called VSTs. When packaging Ardour, it became clear that supporting these plugins was a necessity, as Ardour includes very little in terms of instruments and effects.

On Linux there are five different formats: LADSPA, DSSI and LV2 are open, while VST2 and VST3 are proprietary standards created by Steinberg; the latter are very popular on non-libre desktops and applications. Fortunately, with somewhat open implementations available, they exist on Linux in open-source form. These five audio plugin formats work in whichever application can host them. In general, LV2 is preferred.

Now the problem is that "one doesn't simply drop a binary in a flatpak". I'm sure there are some tricks to install them, since a lot of plugins are standalone, but in general it's not sanctioned. So I came up with a proposal and an implementation to support and build Linux Audio plugins in Flatpak. I'll skip the details for now, as I'm working on a comprehensive guide, but the result is that several audio applications now support plugins in flatpak, and a good number of plugins are available on Flathub.

The music applications that support plugins in Flatpak are Muse3 (not to be confused with MuseScore), LMMS, Hydrogen, Ardour (this is not supported by the upstream project), Mixxx and gsequencer. Audacity is still pending. There are also a few video editors, kdenlive, Flowblade and Shotcut, that support LADSPA audio effects and can now use the packaged plugins.

Sadly there doesn't seem to be a way to find the plugins on the Flathub website, nor in GNOME Software (as found in Fedora). So to find the available plugins, you have to use the command line:

$ flatpak search LinuxAudio
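
Installing one then works like any other Flatpak. For example (the exact application ID is whatever the search above returns; Calf here is just an illustration):

$ flatpak install flathub org.freedesktop.LinuxAudio.Plugins.Calf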

GIMP

In the same vein, GIMP was lacking plugins, and GMic has been the most requested one. So I took similar steps and submitted plugin support for GIMP as a flatpak. This was much less complicated, as we don't have the problem of multiple apps. Then I packaged a few plugins, GMic being the most complex (we need to build Qt5). Now GIMP users have a few important third-party plugins available, including LensFun, Resynthesizer, Liquid Rescale and BIMP.

Thanks to all the flathub contributors and reviewers for the feedback, to the Ardour team for creating such an amazing application, and to Jehan for his patience as the GIMP flatpak maintainer.

Wednesday 26 February 2020

GPSAmi 0.1.0

A while ago I introduced GPSAmi. That was quite a while ago. Over the last week-end I did its first release: 0.1.0. (I got delayed posting this, sorry.)

Things that have changed:

  • Rearchitected the UI code
  • Used meson as a build wrapper, based on the GNOME Builder project template.

Known issues:

  • It might only build on my machine
  • It depends too heavily on udev, which is a problem with Flatpak.

Don't hesitate to file an issue.

Release Info - Download source

Friday 11 October 2019

Getting a stack trace out of a Flatpak

So, the flatpak application you use just crashed

How do you report it? If you file a bug just saying it crashed, the developers will probably ask for some stack trace. On Fedora 30, for example, abrt (the crash reporting system) doesn't provide any useful information. Let's see if we can extract that information.

We are gonna have to use the terminal and some command-line tools. Flatpak has a tool, flatpak-coredumpctl, to use the core dump from the flatpak sandbox. The core dump is an image of the program's memory at the time it crashed, and it contains a lot about the crash. But by default the tool will not be able to provide much useful info; there is some initial setup needed to get a better output.

First you must make sure that you have the right Debug package for the right version of the Flatpak runtime. Well, actually, for the corresponding SDK.

To check the runtime and version:

$ flatpak info org.gnome.MyApp

Check the line that starts with Sdk:

Sdk: org.gnome.Sdk/x86_64/3.34

What comes after Sdk: could be a different value, and that's what is important here. This is what we are looking for.

You can use the command flatpak install --user org.gnome.Sdk.Debug (in our example; just put the one found above) to install it. WARNING: it's likely big. For example, the debug package for GNOME 3.34 is 3.4GB here. I recommend installing it for the user because, unless you have a lot of space on your system filesystem (if it is separate), it will fail.
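
With the version found above appended as the branch, this gives something like the following (assuming the SDK comes from the flathub remote):

$ flatpak install --user flathub org.gnome.Sdk.Debug//3.34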

Then you need to install the debug info for the app. It is the app ID suffixed with .Debug. In that case org.gnome.MyApp.Debug.
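
Following the same pattern, with our placeholder app ID:

$ flatpak install --user org.gnome.MyApp.Debug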

Both will provide the debug info that is necessary to see where in the code things crashed.

$ flatpak-coredumpctl org.gnome.MyApp

This launches gdb inside the flatpak sandbox. It will take a while to process, and use quite a good chunk of memory. What follows is mostly for people that are not familiar with `gdb`; if you are, just go ahead, you know what to do.

When it is ready, something like this will be printed in the terminal:

Program terminated with signal SIGSEGV, Segmentation fault.

And then at the end, the prompt: (gdb)

You can type where and press "Enter". The output is the stack trace: lines starting with #NNN, where NNN is an increasing number. You can copy that output to provide it to the developers in the bug report.

Then you can type quit followed by "Enter" when you are done.

Hope that's useful to you.

Friday 1 February 2019

Music, Flathub and Qt

I have recently started reading about music theory and such, with the purpose of trying to learn music (again). This led me to look at music software, and what we have on Linux.

I found a tutorial by Ted Felix on Linux and MIDI.

I quickly realised that trying these apps on my Dell XPS 13 was really an adventure, mostly because of HiDPI (the high-DPI screen that the XPS 13 has). A lot of the applications found on Fedora don't, by default, support high DPI and are thus quasi impossible to use out of the box. Some of it is easily fixable, some with a bit more effort, and for some we need to try harder.

Almost all the apps I have tried use Qt. With Qt5 the fix is easy, albeit not necessarily user-friendly: just set the QT_AUTO_SCREEN_SCALE_FACTOR environment variable to 1, as specified in the Qt HiDPI support documentation. There is also an API to set the attribute on the QCoreApplication object. There must be a good reason why this is opt-in and not opt-out.
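
For example, launching one of these applications from a terminal with the workaround applied (the application name is just an illustration):

$ QT_AUTO_SCREEN_SCALE_FACTOR=1 rosegarden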

This is the case for Muse3 (aka the MusE sequencer, not to be confused with MuseScore, which works out of the box, at least from Flathub) and Rosegarden.

LMMS and Hydrogen on the other hand are using Qt4 on Fedora 29. The good news? They both have been ported to Qt5, so it is just a matter of building these newer versions. After doing so, they still need the workaround described above.

This is where Flathub comes into play: make them available on Flathub, where we can set that environment variable for the runtime.

In the end, I have Hydrogen available on Flathub, the three others in queue for Flathub, and all have had patches submitted (with Muse3 and Rosegarden already merged upstream).

Now, what other cool Free Software music-oriented applications exist?

Monday 10 December 2018

Microsoft vs the web

I have been saying for a few years now that Chrome is the new IE, and that Google is the new Microsoft (Microsoft being the new IBM). This statement has been somewhat tongue-in-cheek, but I have always been serious about it not being a joke: history is repeating itself. I could go on at length about all the reasons why I believe this to be true, but I’ll just talk about one new development.

Last week, Microsoft announced that they had decided to abandon EdgeHTML, their web browser engine, and move to using Google’s Chromium as the heart of their web browser offering, Edge. [1] Whether it will be just Blink and V8 (the web rendering and JS engines, respectively) or also other parts of Chromium is unclear.

The takeaway from their statement is:

  • Because web developers seem to only care about Chrome, Microsoft believes in the short-term gain of using Chrome for web compatibility, since EdgeHTML is lagging behind. They view web compatibility as a single web runtime, not as better and more diverse standards implementations:


“1. We will move to a Chromium-compatible web platform for Microsoft Edge on the desktop. Our intent is to align the Microsoft Edge web platform simultaneously (a) with web standards and (b) with other Chromium-based browsers. This will deliver improved compatibility for everyone and create a simpler test-matrix for web developers.” [2]

  • They view their own EdgeHTML code base as unportable, even on their own operating systems; it is too tightly bundled:

“We will evolve the Microsoft Edge app architecture, enabling distribution to all supported versions of Windows including Windows 7 and Windows 8, as well as Windows 10. We will also bring Microsoft Edge to other desktop platforms, such as macOS”[3]

  • EdgeHTML is too tightly coupled to Windows, baked into Windows 10 updates; this is the self-inflicted wound that prevents improving just that component, or even fixing security issues, without updating the whole OS:


“If every Edge user were using the very latest version of Edge, it wouldn't be so bad, but that's not the case, and that's because of how Microsoft has bundled Edge with Windows 10. Most home users will end up running the latest feature update to Windows 10 within a few months of its release. But enterprise users are more diverse. … This means that Edge, already a relatively small target for Web developers to think about, suffers major version fragmentation. Contrast this with Chrome, where within a few days of a new version coming out, almost the entire user base is migrated.” [4]

  • This will bring somewhat better parity with Edge on mobile platforms, as the Android version of Edge is based on Chromium. (iOS remains the exception, but I’ll leave that for another day.)

Microsoft recognized that they failed at reconquering the web.

One thing that is clear is that Microsoft will contribute (or try to) to Chromium, Blink and V8 to make these better for them. Remember that Blink is a fork of WebKit because Google couldn’t work with WebKit’s major sponsor, Apple, so this may not be a done deal.

The other clear thing is that the little market share Edge took away from Chrome as an alternative implementation will be aggregated into Chrome’s. Microsoft is actually helping the hegemony of Google, their competitor in several other markets (Bing, Hotmail, Azure), in controlling the web browser space, losing any leverage over web standards.

I wish they had gone the Mozilla route. Not as easy as the one they chose, but still probably way easier than their current situation, and helping Mozilla is helping the web stay relevant as an open standard.

Mozilla’s mission has become more important than ever, and if you want to do something useful for the future of the web, just use Firefox, and ensure, if you are a developer, that everything runs smoothly with it.

Ferdy Christant’s state of the web browser is a relevant read on the whole situation. So is part 2.

Thursday 9 August 2018

AbiWord: It's alive

No, AbiWord is not dead. The project has been very, very slow-moving due to very few contributions. But sometimes a spark ignites things.

Martin needed to fix a major issue we had, related to how AbiWord uses Gtk wrong.

Then I took some time to perform two long overdue tasks:

1. Moving to git. We have had a complete mirror of the Subversion repository for a few years on Github. But now that GNOME moved to their own Gitlab, I felt that this was a more appropriate place for AbiWord. So we did. Thanks to Carlos Soriano, AbiWord is now hosted in GNOME World.

2. Making AbiWord available as a Flatpak. This would alleviate all the issues plaguing us where distributions never update the app, even on the stable branch. For a while I had a manifest to build it; all it needed was a few adjustments and to reach the hosting stage. Thanks to the Flathub team, AbiWord 3.0.2+ is now available on Flathub. Check it out.

The next steps:

1. Release 3.0.3 (the current stable branch) and update the Flatpak.

2. Release 3.next (I think it will be 3.2) as the next stable branch. This one includes a large set of fixes that should address a lot of crashes and memory corruption.

3. Work on more incremental releases. With Flatpak this will be easier.

4. Gtk4 port: we might hit some roadblocks with that.

Thursday 3 May 2018

libopenraw 0.1.3

I just released libopenraw 0.1.3. It has been a year since the last release. It mostly adds new cameras.

I already have a branch for a new feature that will introduce Rust, for the next release.

Friday 20 October 2017

Lightroom exporter/importer

Importing from legacy proprietary software is always important to liberate one's data and help migrate to libre solutions. To that effect, I wrote lrcat-extractor in Rust: a library (crate in Rust parlance) with a command-line tool to extract and dump Lightroom catalogs, the central part of the user-generated data.

Currently only Lightroom 4 catalogs (two versions ago) are supported, as it is the only version I have. Supporting Lightroom 6 is just a matter of me purchasing it and spending some time.

The goal is of course to use this crate in Niepce to import Lightroom 4 catalogs directly.

Head to the lrcat-extractor repository if you are curious.

Also, as a side note, I have the equivalent for the now-discontinued Apple Aperture, which I started to write almost two years ago as a practical exercise to learn Rust, and which I used as a base. Check out aplib-extractor.

The next step is to implement the actual import, probably filling the gaps in the API.

Update: and before someone says it, the timing of this project is quite coincidental.

Thursday 19 October 2017

Porting to Rust

I started something crazy a few weeks ago: porting my app from C++ to Rust.

I'll talk about it here, in several parts.

This is Part 1.

Context

Once upon a time I started to write a Free Software photography application for GNOME. That was around 10 years ago.

At the time I chose to use C++ and Gtkmm 2.x. C++ because it provides unprecedented interoperability with existing C code while offering a stronger type system, RAII and templates, which allow safer programming patterns without a large overhead at runtime. Absolutely no regrets on that.

Over the years, despite never having been released, the code base was updated to C++11 and Gtkmm 3.x.

Meanwhile, Rust was developed at Mozilla. Rust provides safety and high performance, but doesn't have the same ease of interoperability with a C codebase, although it still makes it easy.

Why

Why am I porting to Rust?

  • Rust is designed to be a safe language. In this day and age of 0-days due to the use of C (and C++) and their unsafe practices, this is important.
  • I want the application to be multi-threaded (it already is, by design), which is one of the design goals of Rust.
  • I started writing other standalone components in Rust.

The goal is to progressively rewrite the application in Rust, mostly translating the logic from C++ to Rust. It is not the first time I have done that across languages. I'll go from the back to the front, the front being the UI written with Gtkmm.

Several steps.

Build system

There have already been a few posts about integrating Rust into an automake-based build system, by Federico and by me. The gist of it is to make sure you build your library with Rust and link it into your application.

Building Rust code is trivial using cargo. There is no point using anything else.
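
For reference, this is roughly the Cargo.toml section that makes the crate produce a static library the C/C++ application can link against (the library name is illustrative):

[lib]
name = "rust_backend"
crate-type = ["staticlib"]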

Interfaces

When doing an ad-hoc conversion, the biggest problem is calling code written in one language from the other.

You cannot call Rust methods or generic functions from C/C++, so you'll need to have some interface in place. Rust has made it relatively easy by supporting the C calling convention and making it easy to declare a function with name mangling disabled.

Example:

#[no_mangle]
pub extern "C" fn my_function(arg1: &Type1) -> RetType {
    // body elided; it must return a RetType value
    unimplemented!()
}

From C/C++ this would be something like:

extern "C" RetType my_function(const Type1* arg1);

Here we assume that RetType is a POD (Plain Old Datatype, like int, char, float), not a C++ class, nor a struct in Rust.

There is a tool called cbindgen that will generate C/C++ header from the Rust code. You can call it from the Rust build system to do that automatically. More on that later.
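
As a sketch of what that can look like, a build.rs along these lines regenerates the header on every build (the output file name is illustrative, and cbindgen has to be declared as a build dependency):

// build.rs
extern crate cbindgen;

use std::env;

fn main() {
    let crate_dir = env::var("CARGO_MANIFEST_DIR").unwrap();

    cbindgen::Builder::new()
        .with_crate(crate_dir)
        .generate()
        .expect("Unable to generate the C/C++ header")
        .write_to_file("rust_bindings.h");
}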

Calling C code from Rust is easy provided you have the proper interfaces. It is basically the reverse of the above.
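
A minimal sketch of that reverse direction, declaring a hypothetical C function to Rust with the same placeholder types as above (calling it then requires an unsafe block):

extern "C" {
    fn my_c_function(arg1: *const Type1) -> RetType;
}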

But to call C++ methods, on the other hand, you have to use bindgen.

Bindgen

Bindgen will generate, from C/C++ headers, Rust modules to call the C++ code.

The tool is pretty amazing, not to say magical. Albeit there are some pitfalls.

For example, a C++ class with a std::vector<> as a member will be a problem if treated as an opaque type. To be honest, I am not sure if it would work as a non-opaque type either.

The use of boost might actually generate Rust code that doesn't compile.

Running the Rust tests with cargo test will fail. I think it is Issue 1018.

There are a few things I recommend. Most notably, treat types as opaque whenever possible; whitelist types and functions rather than blacklisting. Your mileage may vary, but this is what I got to work in my case.
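
To illustrate those recommendations, the bindgen invocation in a build.rs looks roughly like this (the header path, namespaces and type patterns are hypothetical; recent bindgen versions name these methods allowlist_*, older ones whitelist_*):

// build.rs (excerpt)
extern crate bindgen;

use std::env;
use std::path::PathBuf;

fn main() {
    let bindings = bindgen::Builder::default()
        .header("src/wrapper.hpp")
        .clang_arg("-x")
        .clang_arg("c++")
        // treat troublesome C++ types as opaque
        .opaque_type("std::.*")
        // whitelist the types and functions we need rather than blacklisting
        .allowlist_type("eng::.*")
        .allowlist_function("eng::.*")
        .generate()
        .expect("Unable to generate bindings");

    let out_path = PathBuf::from(env::var("OUT_DIR").unwrap());
    bindings
        .write_to_file(out_path.join("bindings.rs"))
        .expect("Couldn't write bindings");
}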

Using these tools I managed to rewrite large parts of the backend in Rust with equal functionality. Progressively, more C++ code will be replaced.

I'll get into detail in a future post.

Wednesday 9 August 2017

Status update, August 2017

Work:

In March I joined the team at eyeo GmbH, the company behind Adblock Plus, as a core developer. Among other things I'm improving the filtering capabilities.

While they are based in Cologne, Germany, I'm still working remotely from Montréal.

It is great to help make the web more user-centric.

Personal project:

I started working again on Niepce, currently implementing the file import. I also started to rewrite the back-end in Rust. The long-term goal is to move completely to Rust; this will happen in parallel with feature implementation.

This and other satellite projects are part of the great plan I have for digital photography on Linux with GNOME.

'til next time.

Friday 10 March 2017

So, I got a Dell

Long. Overdue. Upgrade.

I bought a Dell XPS13 as my new portable workstation for Linux and GNOME. This is the model 9360 that is currently available as a Developer Edition with Ubuntu 16.04 LTS (project Sputnik, for those who follow). It satisfies all I was looking for in a laptop: lightweight, small (13"), 16 GB of RAM (at least), a Core i7 CPU (this is a Kaby Lake), and it must run Linux well.

But I didn't buy the Developer Edition. Price-wise, the Developer Edition is CAD$150 less than the Windows version in the "business" line and CAD$50 less than the Windows version in the "home" line (which only has that version). Exact same hardware. Last week it was on sale, CAD$500 off the regular price, so it didn't matter. I got one. I had delayed getting one for so long that this was the opportunity for a bargain. I double-checked, and unlike the previous Skylake-based model, which didn't have the same wifi card, this one is really the same thing.

I got a surprise doorbell ring from the delivery person (the website didn't tell me it was en route).

Unboxing

The black box

It came in a box that contained a cardboard wrap and a nice black box. The cardboard wrap contained the power brick and the AC cord. I'm genuinely surprised by the power adapter: it is small, smaller than what I'm used to. It is just odd that it doesn't come in the same box as the laptop (not a separate shipping box, just boxes in a box, shipped to you at once). The nice black box only bears the Dell logo and contains the laptop and two small booklets. One is for the warranty, the other is a getting-started guide. Interestingly, it mentions Ubuntu as well, which leads me to think that it is the same for both preloads. This doesn't really matter in the end, but it just shows the level of refinement for a high-end laptop, which, until the last Apple refresh, was still more expensive than the Apple equivalent.

New laptop...

Fiddling with the BIOS

It took me more time to download the Fedora live ISO and flash it to the USB stick than the actual setup of the machine, minus some fiddling. Once I had booted, Fedora 25 was installed in 20 minutes. I did wipe the internal SSD, BTW.

Once I figured out that it was F2 I had to press to get into the BIOS at boot, to set it to boot off the USB drive, I also had to configure the disk controller to AHCI instead of RAID, as without that the Fedora installer didn't find the internal storage. Note that this might be more troublesome if you want to dual boot, but I didn't. Also, I don't know what the default setup is in the Developer Edition either.

Dell XPS13 BIOS configuration

Nonetheless, the Fedora installer worked well with mostly defaults. HiDPI is handled, which means that I finally have a laptop with a "Retina Display".

Hands on

The laptop is small, with a HiDPI screen and a decent trackpad that works out of the box with two-finger scrolling. An aluminium shell with a non-glowing logo in the middle, a black inside with the keyboard, and a glass touch screen. The backlit keyboard has a row of function keys at the top, like it should be, a row that doubles as "media" buttons with the Fn key, or actually without it. Much like on a MacBook. The only difference that will require getting used to is that the Control key is actually in the corner. Like it used to be... (not sure if that's the case on all Dells, but I remember hating not having Control in the corner, then getting used to it as if there was no option, and that was more than a decade ago). That will make for some finger re-education, that's for sure. The whole laptop is a good size; it is very compact. Light but solid.

As for connectivity, there is an SD card reader, two USB 3.0 ports (type A) and a USB 3.1 Type-C port that carries HDMI and Thunderbolt. HDMI looks like it requires a CAD$30 adapter, but it seems to be standard, so it might not be a huge problem.

Conclusion

I am happy with it. GNOME is beautiful in HiDPI, and it is all smooth.

Saturday 4 March 2017

Link: The Story of Firefox OS

Just a plain link to The Story of Firefox OS by Ben Francis. Ben retells the story so much better than I will ever be able to.

I’m incredibly proud of what we achieved with Firefox OS. If you measure its success against the original stated goals I think the project went way beyond its initial objectives. We went beyond a technical prototype to successfully ship 20 different commercial devices in over 30 countries, including smartphones and smart TVs. Apart from the incredible feats of engineering, prototyping 30 new APIs for the web platform and building possibly the most complex JavaScript codebase ever created, we built a mobile operating system from scratch and got it to market in less than two years. As far as I know no other team has ever done this.

I worked on Firefox OS for several years as well (I arrived on the project some time after Ben did), mostly until its end. I noticed there was a lot of misunderstanding about what the goal was, and a lot of questions. To me it was a great effort that tried to disrupt the market by opening up the silos of mobile applications, using the web, trying to become the third mobile OS. A project a lot of people didn't think fit in Mozilla's mission. Its conclusion was a source of much personal sadness.

Also remember that, unlike Android, Firefox OS was developed in the true open-source way: in the open, with community participation, and not behind closed doors with a code drop at each release, stripped of features.

Saturday 17 December 2016

So, I'm looking for a job

I decided a while ago to avoid personal posts here with my loud ramblings, because they end up being pointless. I'll make an exception now and bring up a little life issue that is a little bit sudden.

I'm looking for a job, in snowy Montréal (I can do remote too).

If you are looking for an experienced developer to join your team, I can be that person. Even for a limited time, I'm ready to start as soon as you are.

My résumé

And may you have happy holidays.

Tuesday 13 December 2016

Rust and Automake - Part 2

I already talked about Rust and Automake. This is a follow-up.

One of the caveats of the previous implementation is that it doesn't pass make distcheck, as it doesn't work when srcdir != builddir.

The main problem is that we need to tell cargo to build in a different directory. Fortunately Cargo PR #1657 has taken care of this. So all we need is a carefully crafted CARGO_TARGET_DIR environment variable set before calling cargo. You can substitute abs_top_builddir for the top build directory.
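
Concretely, the cargo invocation in the Makefile.am rule ends up prefixed with it, something along these lines (the target directory here is just an illustration, not the actual patch):

CARGO_TARGET_DIR=$(abs_top_builddir)/target cargo build --release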

But let's show the patch. Nothing complicated.