
Tech Board Nomination

So I’ve been nominated for the Ubuntu Technical Board. Modesty and a touch of impostor syndrome prevent me from strongly endorsing myself. Though if voted in, I would certainly work hard at it. Here’s a bit about where I sit in Ubuntuland for those who are voting but don’t know me well:

I’ve been involved in Ubuntu for about five and a half years. I’ve worked in Canonical’s OEM team, as well as the Foundations and Desktop teams, and now the Unity team. I’m a core dev and a member of the MIR team (main inclusion reviews, not the Mir display server). I maintain the default Ubuntu backup program (Déjà Dup / duplicity) and the default Ubuntu login screen. Sort of a jack of all trades, master of none kinda guy.

But really, let me talk real quick about the other candidates and why they would be awesome choices.

Martin Pitt is ridiculously talented and a super nice guy; he knows a little bit of everything; he even has his own fan club, for goodness sake. Loïc Minier is equally helpful and experienced, and is nowadays doing great things on the Touch side of things. Adam Conrad has a great eye for detail; I work with him often doing MIR work. Steve Langasek does great work on the Foundations team and always knows where the Ubuntu bodies are buried. Marc Deslauriers is super smart, appropriately cautious, and has a pleasingly big picture view. Kees Cook, Clint Byrum, Stéphane Graber, and Benjamin Drung have all been very helpful and nice when I’ve interacted with them, but I don’t happen to know them personally enough to be able to write a little blurb.

Point is, all of those nominees rock and would be great tech board members.

Universal Emulator Frontend in Ubuntu 12.04

I wanted to set up a system hooked up to my TV that let me play NES or SNES games from the comfort of my couch. It was an interesting project, and I wanted to share my findings.

Setup

I have a spare laptop with an NVIDIA card running Ubuntu 12.04. If you also have an NVIDIA card, I highly recommend using the latest experimental NVIDIA drivers. They really increased the performance of and reduced the heat from my laptop.
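
If you prefer the command line, installing them looks roughly like this. The exact package name here is an assumption on my part, so check what your release actually offers first:

    # See which NVIDIA driver packages your release offers
    apt-cache search --names-only nvidia-experimental

    # Install the experimental driver (nvidia-experimental-304 is an
    # example name; use whatever the search above actually lists)
    sudo apt-get install nvidia-experimental-304

    # Reboot so the new kernel module gets loaded
    sudo reboot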

Gamepads

I ordered two Logitech wireless F710 gamepads. They have a tiny USB dongle that they talk to wirelessly. They work great out of the box, but note that they must be on different USB socket groups. I first tried plugging them into USB sockets right next to each other and one of the gamepads didn’t work. When I put the USB dongles on different sides of the laptop, both gamepads worked again. ::shrug::

I recommend putting a sticker on gamepad 1 so you know which one it is.
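
To figure out which pad is which, the jstest tool from the joystick package works well; a quick sketch (device paths may differ on your system):

    # Install the joystick testing utility
    sudo apt-get install joystick

    # Wiggle a pad; if this device reacts, that pad is gamepad 1
    jstest /dev/input/js0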

Frontend

I installed XBMC then used it to download an add-on for its Programs section called “ROM Collection Browser”.
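
For reference, installing it from a terminal looks roughly like this; on 12.04 the team-xbmc PPA was a common source, though verify that before adding it:

    # Add the XBMC team's PPA (a common source on 12.04) and install
    sudo add-apt-repository ppa:team-xbmc/ppa
    sudo apt-get update
    sudo apt-get install xbmc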

Using the ROM add-on, you can scan your ROM collections for each emulator. Be prepared for it to take a long time to download screenshots and covers if you have a lot of ROMs. The best feature is the ability to mark ROMs as “favorites” so if you have a huge collection, you don’t have to browse through all the crap each time.

XBMC lets you change the navigation bindings so you can use your gamepad.

NES Emulator

I’m used to the fceu family of emulators (gfceu, fceux, etc.). But they did not support binding the direction buttons on my F710 gamepad. Those buttons send “hat” presses instead of plain button presses.

Looking further, I found an NES emulator I had never heard of. Mednafen can not only handle the “hat” presses on my gamepad but also emulate the Game Boy and a few other systems.

Note that the man page shipped with it is not helpful. You’ll need to browse the online documentation.

Press “ALT+SHIFT+1” to set up bindings for gamepad 1 and “ALT+SHIFT+2” to set up bindings for gamepad 2.

Press “F2” to set up a binding to exit the emulator. This is an important theme! Once XBMC launches an emulator, you need a way to quit it with just the gamepad. When it closes, XBMC comes back. But since you don’t want to use an easy-to-accidentally-hit key or a key that a game is likely to use, you have to be careful. Thankfully, the F710 gamepad has a middle button that normally just turns it on. But once the gamepad is on, the button also sends a normal key press. And no game would need to use this special middle button. So make sure to bind the middle power button to the exit command of mednafen.

Also pass “-fs 1” at least once to turn on fullscreen mode. The option is saved, so you only need to give it once.

When you add your NES collection in XBMC, note that the path to the mednafen command is “/usr/games/mednafen”.
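
Putting that together, a first launch from a terminal looks something like this (the ROM path is just an example):

    # First run: turn on fullscreen once; the setting is saved
    /usr/games/mednafen -fs 1 ~/ROMs/nes/example.nes

    # Later runs no longer need the flag
    /usr/games/mednafen ~/ROMs/nes/example.nes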

SNES Emulator

I prefer the zsnes emulator for SNES games.

It doesn’t have any weird gotchas. Press “Esc” to bring up its main menu. Use the “Input” menu to set up the gamepads. Use the “Misc” menu to assign the exit button.

And don’t forget to enable full screen.

When you add your SNES collection in XBMC, the path to the zsnes command is the expected “/usr/bin/zsnes”.

Arcade Emulator

I found that the mame emulator works great for arcade games.

Press “Tab” to bring up its main menu. Under “Input (general)”, you can find the close command that is currently bound to “Esc” and replace it with your middle gamepad button. I found that most games needed me to individually set up “Input (this game)” bindings.

When you add your arcade collection in XBMC, note that the path to the mame command is “/usr/games/mame”.

Using the TV

I did hit one weird problem using the TV. Both mednafen and zsnes, if fullscreen, would switch which monitor was turned on. To stop them from doing that, I had to manually set each emulator’s fullscreen resolution to the size of the TV.
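
To find the TV’s native resolution to feed into each emulator’s settings, xrandr is enough:

    # List connected outputs and their modes; the mode marked
    # with a '+' is the display's preferred (native) resolution
    xrandr --query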

Tada!

Anyway, that’s “all” it takes. Now you have an awesome emulator station. You can also use the “Advanced Launcher” XBMC add-on to add launchers for Ubuntu games that work well with gamepads, like Jamestown.

To avoid using the mouse or keyboard at all, you can set your user to automatically log in and add XBMC to your startup applications.
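
The autostart half of that can be done with a small per-user desktop entry; a minimal sketch, assuming the xbmc binary is in your PATH:

    # Create a per-user autostart entry that launches XBMC at login
    mkdir -p ~/.config/autostart
    printf '%s\n' '[Desktop Entry]' 'Type=Application' \
        'Name=XBMC' 'Exec=xbmc' > ~/.config/autostart/xbmc.desktop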

Software Updater Changes in Ubuntu 13.04

Thanks to Matthew Paul Thomas’s specification and work by Dylan McCall and me, there is a proposed patch to change the Software Updater’s details panel.

I’m excited about the work and just wanted to highlight it here. See the difference for yourself:

[Screenshots: the old and new views of the Software Updater details panel]

This hasn’t landed yet, but there is enough time before 13.04 that I’m confident it will make it.

A bit about what you’re seeing in that new view:

  • The “Backup” text and icon are being pulled from the .desktop file for the app.
  • “Backup” has subitems because there are some updates that only “Backup” depends on.
  • Updates that are not apps but are part of the base system (i.e. “on the CD but no .desktop file”) are under “Ubuntu base”.
  • For flavors, that should intelligently change to “Xubuntu base” or whatever and use the right metapackage.
  • Notice that the package name is no longer shown, so make sure your package descriptions are high quality!

Ubuntu Maintenance Work

I promised myself I would blog more about the work I do on Ubuntu, but I’ve not done anything glamorous enough recently to talk about. So instead I thought it might be interesting to give a top-level view of some of the behind-the-scenes maintenance work that an Ubuntu developer like me gets up to.

Most Ubuntu developers will already know this stuff, but for those of you who are curious what kind of high-rolling life we lead, read on.

+1 Team

I recently came off a month-long rotation on Ubuntu’s +1 Team. This is a shifting group of people who try to keep all packages buildable and installable.

There are two very helpful web reports for this: the FTBFS (fails to build from source) report and the NBS (not built from source) report.

When a package can’t be compiled from source (due to changes in libraries it uses, changes in compilers, incompatibility with less common architectures like ARM, etc.), it shows up on the FTBFS report. Typical cases are a missing build dependency or needing a newer version of a library.
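
A typical first step is reproducing the failure locally; a sketch using a hypothetical package name foo:

    # Fetch the build dependencies and the source of the package
    sudo apt-get build-dep foo
    apt-get source foo

    # Rebuild it locally to reproduce the failure
    cd foo-*/
    dpkg-buildpackage -us -uc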

The NBS report is for packages that we have to keep in the archive because some other package needs them, but we no longer have the source package in the archive to match it. For example, if libfoo1 becomes libfoo2, the foo source package no longer builds libfoo1, but we still need that package in the archive until all other packages use libfoo2. Usually, this is as simple as a rebuild of affected packages. When there are no more users of libfoo1, an archive admin can drop it from the archive.
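
Sticking with the hypothetical libfoo1, checking whether anything still depends on the old binary package is a one-liner:

    # List packages that still depend on the obsolete library
    apt-cache rdepends libfoo1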

This team always likes help, if you’re interested!

MIR Team

The MIR team is responsible for approving packages that need to enter main, usually so that they can be included on the Desktop or Server CDs. Inclusion in main additionally means that Canonical is on the hook to support the package.

This review is largely focused on making sure that the package will be well maintained, that it doesn’t have a terrible history of security flaws, and that it doesn’t needlessly duplicate packages already in main (for example, we’d like the security team to have as few XML parsers to look after as possible).

This is a relatively small team, since these reviews don’t need to happen often.

Can’t Log Into Precise? Here’s Help!

So apparently I helped break the login screen for a lot of non-English speakers in Ubuntu 12.04 recently. Sorry!

The basic symptom is that regardless of your system keyboard layout, you would get a “us” layout, which likely meant you couldn’t enter your password correctly.

To work around this, select the correct layout from the keyboard indicator in the upper right of the login screen.

I’ve pushed a fixed version of unity-greeter to the archive (0.2.0-0ubuntu4). So once you do manage to log in and update your system, you shouldn’t hit this problem next login.
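
If you can’t log in at all, a text console is unaffected by the greeter bug; roughly:

    # Switch to a virtual terminal with Ctrl+Alt+F1 and log in, then:
    sudo apt-get update
    sudo apt-get install unity-greeter

    # Ctrl+Alt+F7 usually gets you back to the graphical login screen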

Vala Autotool Tricks

If you have a project that uses Vala and autotools, you are probably using automake 1.11’s native support for the language.

Overall, automake’s support is great. But there are a few oddities that I’ve worked around and thought I’d share here.

Declaring a preferred valac version

Let’s say your project expects valac-0.14. But sometimes people have more than one valac installed, and valac-0.14 may not be their default valac compiler. The AM_PROG_VALAC macro only looks for ‘valac’ in PATH, so you’ll end up with whatever default compiler the user has.

Instead, create an acinclude.m4 file in your project directory. Copy the AM_PROG_VALAC macro from /usr/share/aclocal-1.11/vala.m4 into it. Then:

  1. Change the name to something like MY_PROG_VALAC
  2. Change the AC_PATH_PROG line to something like:
    AC_PATH_PROGS([VALAC], [valac-0.14 valac], [])
  3. Change the AM_PROG_VALAC call in configure.ac to be MY_PROG_VALAC instead

Now configure will first look for the ‘valac-0.14’ executable and then fall back to plain old ‘valac’.
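
For intuition, the search that AC_PATH_PROGS performs is roughly equivalent to this shell lookup:

    # Prefer valac-0.14; otherwise use whatever 'valac' is in PATH
    VALAC=$(command -v valac-0.14 || command -v valac)
    echo "Using compiler: $VALAC"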

Stop shipping C code

For a long time, I shipped the valac-generated C code with my release tarballs because I did not want distributors to have to deal with valac churn. But nowadays, especially with valac’s six-month release cycle, things are calmer.

Not shipping the C code makes tarballs smaller (over 20% smaller in my case), makes it easier for distros to grab upstream patches (they don’t have to create a C patch from your Vala patch), lets you use Vala conditionals (more about that below), and means you’re not shipping something different from what you yourself are using.

This only requires a simple change. In each source directory that contains Vala code, add the following bits to its Makefile.am (for the purposes of this example, your Makefile.am is creating the program foo):

  1. Separate out all your .vala code files into a separate variable called foo_VALASOURCES. If you want, you can then use this variable in your foo_SOURCES variable, so you don’t have to specify them twice.
  2. Add the following dist-hook:
    dist-hook:
    	cd $(distdir) && \
    	rm $(foo_VALASOURCES:.vala=.c) foo_vala.stamp

This will delete the generated C code and stamp file when creating a dist tarball.
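
You can sanity-check the result after building a tarball; foo-1.0 here stands in for whatever your package name and version produce:

    # Build a release tarball and confirm no generated C slipped in
    make dist
    tar tzf foo-1.0.tar.gz | grep '\.c$' || echo "no C files shipped"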

Use Vala conditionals

One problem with shipping the generated C code is that you can’t use #ifdefs in your code. This is because Vala does not have a version of #ifdef that will “fall through” to the C code. Whether a conditional block of code gets used or not is always decided at valac compilation time.

And since your tarball consumers were compiling the C code, you couldn’t use Vala conditionals. To have an optional project dependency, you had to write little C wrapper functions that did use real C #ifdefs and then call them from Vala. Which is a pain.

But now that you’re shipping straight Vala code, everyone compiling your program gets Vala conditionals decided at valac compilation time.

Let’s say you have an optional dependency on libunity:

  1. In your configure.ac, find the block of code that deals with having successfully detected libunity and add the following line:
    AC_SUBST(UNITY_VALAFLAGS, ["--pkg unity -D HAVE_UNITY"])
  2. In your Makefile.am files, add $(UNITY_VALAFLAGS) to AM_VALAFLAGS
  3. In your Makefile.am files, add a new dependency to each of your stamp files:
    foo_vala.stamp: $(top_builddir)/config.h
  4. In your Vala code itself, you can now surround your libunity-using code blocks with #ifs:
    #if HAVE_UNITY
    ...
    #endif

The first step declares the HAVE_UNITY symbol if we’re using libunity. If we’re not, that line won’t ever be reached and UNITY_VALAFLAGS will be empty. The only other interesting bit is step three, which ensures that when ./configure is re-run, valac will also be re-run to pick up any new symbol definitions.
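
A quick way to confirm the plumbing is to look at what configure substituted, with and without libunity installed (src/Makefile is an example path):

    # After running ./configure, inspect the substituted flags
    grep UNITY_VALAFLAGS src/Makefile

    # With libunity detected you should see:
    #   UNITY_VALAFLAGS = --pkg unity -D HAVE_UNITY
    # Without it, the variable will be empty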

Backups and Distro Upgrading

tl;dr: I don’t recommend using Déjà Dup to hold your data when you upgrade distros (e.g. from Ubuntu 11.04 to 11.10) without understanding the risks.

I’m the maintainer of the Déjà Dup backup tool that will be included by default in Ubuntu 11.10. So I’m generally biased in its favor. But I am also a cautious person.

My concern stems from the fact that Déjà Dup uses an opaque backup format [1], which means it does not store your data in plain files that you can just copy back into place with the file manager. You’ll need to use Déjà Dup again to restore them [2]. That’s fine as long as Déjà Dup is working correctly, as it should be.

But just from a risk management perspective, I always recommend that people try to have at least one copy of their data in “plain files” format at all times.

So if you back up with Déjà Dup, then wipe your disk to put Ubuntu 11.10 on there, you’re temporarily going down to zero “plain file” copies of your data. And if anything should go wrong with Déjà Dup, you’ll be very sad.

Here are a few recommended ways to upgrade:

  • Use the Ubuntu CD’s built-in upgrade support. It will leave your personal files alone, but upgrade the rest of the system.
  • Use the Update Manager to upgrade your machine. Again, this will leave all your personal files in place.
  • Copy your files to an external hard drive with your file manager and copy them back after install (a command-line sketch follows this list).
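
For that last option, here is the command-line equivalent of the file-manager copy; the mount point is an example, so adjust to wherever your drive lands:

    # Copy your home directory to the external drive, preserving
    # permissions and timestamps
    rsync -avh ~/ /media/external/home-backup/

    # After the fresh install, copy everything back
    rsync -avh /media/external/home-backup/ ~/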

In my mind, a backup system’s primary use case is disaster recovery, where going down to zero “plain file” copies of your data is unintentional and an acceptable risk. Intentionally reducing yourself to zero copies seems unnecessary.

Hopefully, all this caution is overblown. I just want people to be aware of the risks.

[1] Déjà Dup uses an opaque format to support a feature set that just can’t be done with plain files:

  • Encryption
  • Compression
  • Incremental backups
  • Assuming little about the backup location (allowing cloud backups)
  • Supporting file permissions, even when backing up to locations that don’t

[2] There are technically ways to recover without going through the Déjà Dup interface. They just aren’t very user-friendly.

Why I Ubuntu

I thought I’d share my source of motivation for working on Ubuntu in the hopes that it might inspire others too! (Note that this is my personal opinion; I don’t speak for Canonical in this blog.)

I believe that the world is moving in the direction of widespread technical savvy and ubiquitous technical companions (think smartphones or tablets).

I want such technology to be a force of empowerment, and thereby a force of good for humanity. To me, empowering means cheap, trustable, adaptable, and easy to use.

Open source software trivially fulfills the first three, in ways that proprietary software cannot. And Ubuntu is doing its best to fulfill the last.

Cheap

Yes, technically Open Source doesn’t have to be free. But in practice, it is. Proprietary software can be too, usually through ads. But Open Source always is and, more importantly, always will be.

Trustable

With Open Source, it’s easy to have complete confidence even without being an engineer. As a consumer, Open Source means it won’t spy on me and will do only what it says on the tin.

Adaptable

Proprietary software can be adaptable, no question. Not all adaptations require source access, though it does help.

But importantly, Open Source reduces the opportunity cost for creating or even using software. You can stand on the shoulders of giants. Anyone can provide support or contract work.

There is no barrier to entry to modify Open Source software. And most such software is designed specifically to interoperate with other software, so it’s easier to mix and match.

By way of example, think of a fictional school system in India that wants to create a customized version of Ubuntu. It’s trivial to do, and they don’t need to seek anyone’s permission.

Ease of Use

This last point has generally eluded the open source community. But it’s the most important in my mind. What good is the most capable technology in the world, if it never improves lives because it is so hard to use?

My goal is to empower, in a utilitarian sense, as many users as possible. I rarely write code for me alone.

This, more than the rest, is why I support Ubuntu. Ubuntu specifically focuses on users and on bringing Open Source across the chasm. And more than that, of all the current efforts, Ubuntu has the best chance of actually crossing it.

Conclusion

When I think of empowering, I don’t tend to dwell on the modern first world. They don’t especially need empowerment. I’m thinking of the less-franchised, or even of our own sci-fi future, when our relationship with technology becomes even more important. Do you think Geordi would run code on the Enterprise for which he didn’t have source access?

Also note that this is not a moral argument; I don’t especially consider Open Source a moral directive for these purposes. Users won’t flock to us because Ubuntu is open source, but rather because Ubuntu delights them.

I understand why people work on splinter efforts or other projects, but for me, the work that Canonical does with pre-installs, enterprise support, for-purchase apps, Ubuntu One, and user testing is an invaluable addition to the main Ubuntu project. These efforts are how we reach new users.

OnlyShowIn Unity

tl;dr version: Add Unity to your OnlyShowIn keys.

This is just a PSA that I’ve been doing work in Ubuntu to support Unity’s addition to the set of registered XDG environments.

This means that in your desktop file, you can now say OnlyShowIn=Unity; to have an application only show up or only autostart in Unity instead of KDE, GNOME, etc. (Or alternatively, use NotShowIn=Unity;.)

By and large, Unity uses the same apps and services that GNOME does. So if you currently have an application that does OnlyShowIn=GNOME;, there’s a very good chance it should now be OnlyShowIn=GNOME;Unity;. You might also want to re-evaluate whether you even need OnlyShowIn.
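
If you maintain such a desktop file, the change itself is tiny; foo.desktop is hypothetical:

    # Extend an existing OnlyShowIn=GNOME; entry to include Unity
    sed -i 's/^OnlyShowIn=GNOME;$/OnlyShowIn=GNOME;Unity;/' foo.desktop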

I’ve filed a bunch of bugs against various GNOME apps and patched them in Ubuntu. But I’ve only focused on apps that Ubuntu installs by default.