Articles tagged with: Virtualization

Emulators for Ubuntu Linux

I love videogames and I grew up with what is now called retrogaming. I also switched from Windows to Linux a while ago and, despite having a dedicated gaming desktop PC, that machine is mainly for recent titles. Taking advantage of some holidays I decided to set up some emulators on the laptop. It wasn't always easy, so I decided to write this post, both for myself in case I need to repeat it and in case it is of use to anybody else.

Shadow of the Beast - Commodore AMIGA

Note: At the time of writing this blog post, my system is Ubuntu 16.04 x64. Based on my experience, Linux software is very sensitive to operating system versions (way more than Windows software), so I can't guarantee that everything will run on, for example, Ubuntu 17.04.
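If you aren't sure which version you're on, a quick check from the terminal (standard Ubuntu command):

    lsb_release -a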

For Arcade machines I use MAME: the official Ubuntu Software Center MAME build/binaries plus the GNOME Video Arcade GUI (available at the Ubuntu Software Center too). The main issue is that it is a barebones GUI, missing many features from things like MAMEUI, so I also keep MAMEUI inside a virtualized Windows XP SP3.
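If you prefer the terminal over the Software Center, the equivalent should be the following (assuming the standard Ubuntu universe package names):

    sudo apt install mame gnome-video-arcade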

Before continuing, a small intermission to explain the reasoning behind that Windows XP virtual machine. VirtualBox has come a long way regarding virtualization, and even under Linux (where I haven't been able to get 3D acceleration working) it works quite nicely. I use it mainly for three videogame-related tasks (among other unrelated ones):

  • Playing old Windows games that don't work with Windows 7 and don't require Direct3D, like Civilization II.
  • Accessing advanced VisualBoyAdvance features like the object and map memory visualizers (GameBoy development related). It works really well inside VirtualBox, and Wine wouldn't load it.
  • Launching MAMEUI to see game snapshots (screenshots). Initially only until I finish doing the cleanup, but I actually want to try running a virtualized MAME32. Note that I haven't tried running it with Wine, so it might work there too. From a host SSD the VM boots up in less than 3 seconds, and it just has a few shared folders configured so I don't need to move things in and out (see the example below). And yes, XP is really old, but that version is precisely the one I chose for conflicting games (with Vista and 7, more than a few old Windows games stopped working). Oh, and for USB 2.0/3.0 support and other goodies, also install the VirtualBox Extension Pack.
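Shared folders can be configured from the VirtualBox GUI, but also from the host terminal via VBoxManage (the VM name and paths here are just placeholders):

    VBoxManage sharedfolder add "Windows XP" --name roms --hostpath ~/emulation/roms --automount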

For old Nintendo systems (GameBoy, GameBoy Color, GameBoy Advance, NES/Famicom and Super Nintendo/Super Famicom) I use Higan. It should be at least v103. The main reason (apart from the typical better emulation and speed) is that GameBoy Advance BIOS ROM loading was mostly broken under Linux and got fixed around version v100. One remark: to run GameBoy Advance ROMs, you need the BIOS ROM.
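By the way, if Higan refuses to boot GBA games, it's worth double-checking your BIOS dump from a terminal; the checksum in the comment below is the one commonly reported for a proper dump (a community-sourced value, so treat it as orientative):

    md5sum gba_bios.bin
    # commonly reported MD5: a860e8c0b6d573d191e4ec7db1b1e4f6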

To play the SEGA MegaDrive/Genesis I used DGen/SDL, but you need to compile it yourself and it is command-line based, so in the end I tried Gens and am happy with it.
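For reference, building DGen/SDL was the usual autotools routine, and then you pass it the ROM from the command line (the ROM name is just an example):

    ./configure && make && sudo make install
    dgen game.bin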

My beloved Commodore AMIGA 500 still works nicely, but floppy disk loading times and the like are tiring, so I also play this great computer via emulation. Specifically, using FS-UAE + FS-UAE Launcher. A few notes/tips here too:

  • RTFM. There are many options and some "flows" are really clunky, like disk swapping (you need to "multi-select" all disks to be able to press F12 and switch them, but the UI doesn't mention this anywhere; see the config sketch after this list). The documentation is almost a mandatory read in this case.
  • To run anything you need the desired AMIGA firmwares (e.g. I needed the Amiga 500 one). In this case there are even official commercial compilations of AMIGA software which include them.
  • To run Workbench tools you need the AMIGA Workbench, and to run hard disk programs you need to first create an AMIGA hard drive (via the configuration).
  • Between the AMIGA being prone to Guru Meditation errors (either with cheats or with obscure unrecoverable errors) and other mysterious hangups that froze my whole Ubuntu to death once or twice while running in fullscreen, this is the only emulator that didn't feel 100% stable. Still, in general it works nicely, and I'm not entirely sure the freezes aren't due to my graphics card, as in windowed or borderless maximized mode I had no such issues.
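As a minimal sketch of a .fs-uae configuration file, assuming an A500 game that spans two disks (paths and file names are placeholders), note that the floppy_image_N entries are exactly that F12 swap list:

    [config]
    amiga_model = A500
    kickstart_file = /path/to/amiga500-kickstart.rom
    floppy_drive_0 = Disk1.adf
    floppy_image_0 = Disk1.adf
    floppy_image_1 = Disk2.adf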

And finally, to play old MS-DOS games either not available at Good Old Games or that I already own, nothing beats the great DOSBox, which can for example be found at the Ubuntu Software Center. It virtualizes a generic PC and operating system, so each game might need individual tweaks, but many work perfectly out of the box.
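The basic routine once inside DOSBox is always the same: mount a host folder as a drive and launch the game from it (the folder and game names below are just examples):

    mount c ~/dosgames
    c:
    cd keen
    keen.exe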

One thing that I haven't tried yet is Playstation/PSX emulation. The PS2 is still not 100% emulated under Windows, so I don't even care, but checking Linux PSX emulators is still pending; there should be something decent out there already...

Once set up, all this software works nicely, but getting there is not an easy task (at least not without this post summarizing it ;). There is a great all-in-one solution that I tried, RetroArch. It is a multi-emulator GUI that supports plugins to run many, many systems, from legacy ones to really recent stuff like the Wii (via the Dolphin emulator). The reason I wasn't convinced by it is that, at least when I tried it a few months ago, the Linux build was unstable and only worked with some systems. The Windows build looked way more robust (I tested it), but as it wasn't my plan, I uninstalled it. It is the base system used by the RetroPie distribution, so that distro comes correctly set up and already contains many basic features I probably missed out on, but it wasn't as trivial as I thought.

My laptop is an old 2012 Dell XPS, but it runs the systems mentioned above perfectly. I know a Raspberry Pi can now even run Neo-Geo games at a decent framerate, so one day I'll get one, but my two main reasons for waiting are: a) I wanted to go through this learning experience before grabbing a quick'n'easy solution, and b) I still want to wait a bit more until commodity hardware evolves and runs more powerful machines like the PSX or GBA without frame skipping (probably something like a Raspberry Pi 4 will do).

Hope this list helped you out!


Having a good, disposable devbox

Around 2005 virtualization was already working nicely, and at work, as we did .NET consulting, we started using virtual machines (with Virtual PC) as our main development environment. We would have a base VM snapshot with Visual Studio, and then when starting a project we'd just clone it and add the specific requirements (e.g. SQL Server). It was pretty much manual, but still a great improvement over having to clean or even format your host machine between projects. Also, migrating to new hardware was seamless: just copy the VM image and you were good to go.

Now things have evolved a lot and, having switched to a mostly open source stack and a Linux development environment, there's a wider array of options, plus greater automation capabilities. Until recently I had never been directly involved in the management of development tooling resources or projects (except for one or two small scripts), but I've tried to use whatever environments were provided, with varied results.

Leaving aside the "optimal" scenario of having everything on your machine (it is the fastest and quickest in the short term, but has lots of disadvantages too), I moved from manually managed local VMs to remote dev machines, where you would rsync files, SSH in whenever you needed to restart a process, and usually deploy your code to another location (being web dev, mostly having a local dev like kartones.localhost.lan and a remote webserver like kartones.xxx.dev); there's a sketch of that sync loop after the following list. This approach is not bad (it worked for me for quite a while and in multiple jobs), but it has two big disadvantages:

  • No connection means you can't work: You have to set up one (or more) backup DSL lines with a different ISP at the office for outages (which, at least in Spain, are not so infrequent)
  • The noisy neighbor problem: If you share your machine with another 4 developers and your build process is CPU or IO heavy, or you must run some Hadoop map-reduces, you can easily eat up all the resources and impede others' work
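For reference, the sync step of that remote workflow was roughly a one-liner like this (user, host and paths are placeholders):

    rsync -avz --delete ./ devuser@kartones.xxx.dev:/home/devuser/myproject/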

With few people, remote dev machines are a good approach, but as you grow they become a severe limitation.

So, how do we solve these issues? Well, thanks to VirtualBox, Vagrant and Puppet, we can now easily have provisioned development virtual machines: local but instrumented VMs that closely match a production server, and whose configuration and installed packages are managed from the same tool that sets up the production machines, just requiring different config sections (but mostly being a copy + paste + rename task). I've lived through three iterations of this approach at different jobs: from a quite manual (and badly working) version, to a "working but not smooth enough to replace a local dev env" one, to my current job's setup, which works so nicely that we now don't support anything except the devbox.
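As a minimal sketch (the box name and Puppet paths are illustrative, not our actual setup), the Vagrantfile wiring VirtualBox and Puppet together looks something like this:

    # Boots an Ubuntu VM under VirtualBox and provisions it with the same
    # Puppet manifests used for production machines (paths are placeholders)
    Vagrant.configure("2") do |config|
      config.vm.box = "ubuntu/xenial64"
      config.vm.provision "puppet" do |puppet|
        puppet.manifests_path = "puppet/manifests"
        puppet.manifest_file = "devbox.pp"
      end
    end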

It took us weeks of iterating and forcing the whole tech team to install it by themselves, just following the README instructions and providing either feedback or direct commits with improvements, but it feels worth it because:

  • We're all on the same environment: Problems "are shared", so they're quickly solved and easily reproducible. If something breaks, it breaks for everyone, so there's no broken windows effect
  • The process is dead easy to follow: I try to push every service and tool to have a README.md detailing instructions, but in this case the more we use it, the easier it gets and the more we improve and automate it
  • Fast and isolated: Not native speed, but the faster your hardware, the faster it goes, and you never hurt other team members' speed with heavy scripts you run
  • No need to depend on external storage for a backup: In the past I used to carry a USB dongle with a backup of the VM, just in case the original died, a fatal update broke it (Ubuntu is great, but more than once an update has broken the VM at boot) or I just wanted to revert some undesired change
  • Almost the same environment as production: This ultimately depends on you, but the closer you get to replicating production, the easier it is to triage configuration issues regarding the web server, caching, connection pools...
  • Easily updateable: Linux kernel updates, provisioned software updates, individual repository dependency updates... everything handled via single commands (see the examples after this list)
  • Everybody participates: Have an idea to automate something? Code it and push it!
  • Helps keeping codebases homogeneous: Having templates for microservices and web-apps is handy, but having lots of services that have to be configured, launched, tested, etc. means you naturally set up conventions for folder and code structure, helper scripts, launchers/runners... Make doing things the right way easy and it will yield better results (or at least make it hard to go wrong!).
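Regarding those single commands: ours are wrappers, but conceptually they boil down to things like the following (illustrative, not our exact scripts):

    vagrant box update    # fetch a newer base box image
    vagrant provision     # re-run Puppet to update provisioned software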

We bet so hard on keeping this process quick, easy and painless that, if I were allowed to, I'd set up the devboxes to self-destruct after 2 weeks of use, to force everybody to re-install them and to always be sure that, no matter what happens, you can reprovision and have a working dev environment in a few minutes. I manually delete mine (including the code repositories), and you feel at peace and calm when all you need to do is:

  1. Clone the operations repository
  2. vagrant provision
  3. vagrant ssh + run install.sh script providing your desired username

And this is just the beginning: now, with containers (Docker and the like), we're moving towards an "optimized" version where you can replicate something really close to production on your local machine, with disposable instances, always updated (and using the same mechanisms as production, to avoid nasty errors) and making much better use of resources. But I haven't talked about them because we haven't migrated to containers yet, so I have much to learn and experiment with before being in a position to give an opinion; I'm just eager to try it!


Two VirtualBox tricks

At the beginning of the year, I was told to try VirtualBox as a virtualization solution for my personal needs (trying Linux, and having a few VMs for things like beta versions of software). After fighting with Virtual PC, and not being convinced by VMWare, I must admit VirtualBox is almost perfect: USB support, 3D acceleration (experimental and not for games, but it's faster than software rendering anyway), support for almost every OS (whether it's Linux or Windows)...

But sometimes it is not as easy to use as the others, hiding actions that, at least in my opinion, should be clearly visible and easy to perform. The two most common ones are these:

  • Change the UUID of a VM disk: If you create a base VM and copy the .vdi file, VirtualBox will complain that the UUID is already in use. To change it on the copy, open a command prompt and execute (the executable is located in the VirtualBox install folder):
    VBoxManage internalcommands setvdiuuid xxxxxxx.vdi
  • Mount a shared folder: To share folders between the host and guest operating systems, you have to use shared folders. But once you add one to VirtualBox, it doesn't appear, does it? Well, you have to mount it manually (it won't just show up under My Computer), because it is a network share instead of a virtual drive (see the commands below):
    \\vboxsvr\xxxxxxxxxxxx
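For example (the share name is a placeholder), from a Windows guest you can map the share to a drive letter, and from a Linux guest you can mount it via the vboxsf filesystem (the Guest Additions must be installed):

    net use x: \\vboxsvr\myshare
    sudo mount -t vboxsf myshare /mnt/myshare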

These are two actions you'll end up performing sooner or later, so I hope they prove useful!

P.S.: As an off-topic note, I'm going to PHP Conference Barcelona this weekend, so if you're around and/or attending, just search for the Tuenti folks and I'll be among them :)


Installing Debian 5.0 in Virtual PC 2007

I've spent most of this morning installing the newly released Debian 5.0 under Virtual PC 2007 (SP1).

As usual with VPC, non-Windows OS installs give headaches, but this one has been a real pain.

Just after creating a new VM and launching the install, I got this terrifying message: "An unrecoverable processor error has been encountered. The virtual machine will reset now". Trying the non-graphical install yields the same result.

After some digging on the net, I found that it is a problem between the latest Linux kernel builds and Virtual PC, which can be solved by selecting "Graphical Install" but, instead of pressing ENTER, pressing TAB and adding the following at the end of the command line:

noapic nolapic noreplace-paravirt

Also, modify the "vga=788" parameter to "vga=791" to avoid strange resolutions.

Then you can install it as you would with a physical/real install (a standard + web server install takes only about 2 GB of space, counting the swap partition).

Once installed, these parameters will stick and you will no longer get the VPC error, but the problems are far from over.

  • To enable sound, go to System -> Preferences -> Sound and change everything to the ALSA sound system.
  • To enable internet, change your VM settings to use Shared Networking (NAT) and configure it under Linux as DHCP/automatic (there is a config snippet after the xorg.conf sections below). I couldn't get a standard ethernet card mapping to work without NAT.
  • To enable higher resolutions (at least the standard ones), modify the /etc/X11/xorg.conf file (with nano, for example) and replace the three sections "Device", "Monitor" and "Screen" with this new content:

Section "Device"
    Identifier "Configured Video Device"
    Driver "vesa"
    BusID "PCI:0:8:0"
EndSection

Section "Monitor"
    Identifier "Configured Monitor"
    Option "DPMS"
    HorizSync 30-70
    VertRefresh 50-160
EndSection

Section "Screen"
    Identifier "Default Screen"
    Monitor "Configured Monitor"
    DefaultDepth 16
    SubSection "Display"
        Depth 16
        Modes "1280x1024" "1024x768" "800x600"
    EndSubSection
EndSection
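Regarding the networking bullet above: the DHCP configuration can also be set by hand, editing /etc/network/interfaces so that it contains something like this (standard Debian syntax, assuming eth0 is the virtual NIC):

    auto eth0
    iface eth0 inet dhcp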

Save, restart Linux and you finally have a decent Debian 5 virtual machine! The mouse still gets captured by the VM but, since you can't install the VM Additions, you can live with it.


Windows Vista Pre-RC1 under MS Virtual PC 2004 SP1

Well, as right now I can't install the pre-RC1 of Windows Vista on any of my PCs, last weekend I decided to install it on a Virtual PC 2004 to see if it works... And it worked!

Windows Vista RC1 under Virtual PC 2004

The VM Additions work, at least the graphics driver (without it, moving a single window was hell!), and well... you can test Vista. No fancy 3D, and you should disable some things, but it works fast enough to try its new security system, tools, options rearrangement and such.

I've installed it on a P4 HT 3.4 GHz, 1 GB DDR2 dual-channel RAM (700 MB assigned to the VM), and a SATA HD (15 GB assigned), with almost no other applications running.

I've disabled/deactivated the following to try to optimise for speed:

  • Cleartype font smoothing (vast performance improvement), and menu animations, fades, slides, and such (shadows and visual styles kept)
  • Windows Defender & Firewall (no change in performance)
  • No screensaver/power-saving
  • HD not indexed for fast searching (it took a while to "deactivate" indexing because of the almost 6 GB Vista takes once installed)
  • Disabled System Restore
  • Defragmented the HD

And now it runs quite well. It's not as fast as XP (I've developed with a VS2003 + XP virtual machine on lower-spec PCs), but I'll be able to play with the OS for a while :)

Note: I haven't configured internet/LAN yet, so I hope the VM Additions' virtual ethernet driver works. The emulated sound card didn't.