Sunday, December 22, 2013

SteamOS Beta: Installation and Test

Despite my stance on DRM (I abhor the entire concept for many, many reasons, but that's for another post), due to gentle encouragement (read: demands) from friends I decided to try out Steam (and specifically, SteamOS) and see what it's all about.

At this point SteamOS is in Beta, and while there's still a lot to iron out, once you get it installed it works pretty well, including fully functional AMD GPU drivers despite there being no official support.

Initially I tried to run this on my old Core 2 Quad system, but the install failed because the BIOS wasn't new enough (needed to have UEFI Boot option). But on a newer machine the install process was much smoother. It was a fairly low-specced machine with the following parts: Intel G1610, 2GB RAM, AMD HD7770, 128GB Sandisk Ultra Plus SSD.

SteamOS desktop

Here are the steps:

  • I chose the SteamOSInstaller.zip option, mainly because I thought the CloneZilla-based image option wouldn't work, since I didn't have the 1TB HDD apparently required for the image.
  • Instead of following the installer instructions in the SteamOS FAQ, I converted it to an ISO using the following command:
  • grub-mkrescue -o SteamOS.iso steam/
  • Next I put the ISO on a flash drive (using Unetbootin or the ISO writer of your choice -- see the dd sketch after this list). I discovered that even with the ISO, UEFI boot is required.
  • Selected "Automated Install" from the boot list. Everything went smoothly for me from this point.
  • Initially I was just running with Intel onboard graphics. After install, I plugged in the HD7770. It picked up the fglrx_pci drivers and "just worked" which was nice.
  • The first time, you have to run "steam" from the console. It downloads and installs Steam (about 220MB). After installing and creating my account, it "crashed" (spat out a crash ID anyway). I restarted using the "Steam" icon on the desktop, which worked fine.
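
For the flash-drive step above, a minimal dd sketch is one alternative to Unetbootin (assuming the drive shows up as /dev/sdX -- check with lsblk first, as this wipes the drive):

$ sudo dd if=SteamOS.iso of=/dev/sdX bs=4M
$ sync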

The first game I installed was Killing Floor. I was only running it at 1680x1050 (using an old monitor), but it looked great at High settings.

Killing Floor on SteamOS at 1450x900

Only games with a Linux target will work. Non-Linux games appear in the Steam client window, but don't have an "Install" button. I assume someone will work out how to get a wine option running Windows-only games via SteamOS, but I haven't tried this yet.

Things of interest to note:

  • The installer erases the target drive, and potentially any other drives plugged in. I installed on a fresh system to make sure no data was accidentally lost.
  • USB drives mount as read-only by default, and the "steam" user doesn't have permission to do even that. Makes it a pain trying to transfer screenshots or anything between computers.
  • The software sources are limited basically to Steam repos. I assume you can add other sources in, but haven't explored this.

Friday, December 13, 2013

GOG: If Only There Was Support For...

Update 27 Jul 2014:

GoG has announced support for Linux. Currently there are 50 or so games.

Original post (13 Dec 2013):

In the past I've bought quite a few games from awesome DRM-free games site Good Old Games. But it's been a while now, primarily because I haven't used Windows for a couple of years.

They send me regular emails like this one, asking me to come back:

And I keep saying "I'd love to buy more stuff, but none of the games will run on Linux".

I understand why they can't support the platform, so I'm not blaming them, but it makes me sad, because I'd love to support their efforts.

:(

Tuesday, November 12, 2013

Review: TP-Link TL-WN722N USB Wifi Adapter

Yesterday a friend from work lent me his TP-Link TL-WN722N USB wireless adapter to try out.

I plugged it into a machine running Linux Mint 15 (Cinnamon) 64-bit. At first everything seemed fine: it connected to the router flawlessly, and started getting data as fast as the internet could provide it.

Then I moved the machine into another room, to see how it would go through walls. It was about 8 metres away, going through two brick walls. It could no longer connect.

You mad bro? (Definitely didn't like being moved away from the router)

I gradually moved the machine closer, but found that I had to be in direct sight of the router and less than 1 metre away before it would connect up. I can't imagine this is normal, and must be a fault with the hardware. Bit unfortunate! If you've got one of these and get similar symptoms, I'd be returning it straight away.

Unfortunately I can't really do a proper review with these results (even by my admittedly rough standards). :(

Wednesday, November 6, 2013

Review: MSI B75MA-E33 and MSI DVD-RW

I hadn't used an MSI motherboard before, so I thought I'd give one a go. They apparently have a relatively high fault rate compared to competitors, so I don't see them recommended as often.

The B75MA-E33 is a budget board, I guess targeted at low cost builds. It was the cheapest B75 chipset board at my local parts store.

The PCB wasn't too thin or flimsy, no more so than other B75 boards I've used anyway. I paired it with a G1610 CPU, and it booted up first time with no problems. All ports were functional, so no issues at all. Didn't spend long in the BIOS, seemed okay, but the case didn't have a fan that could be controlled by the motherboard, so didn't get to really test any of their fan control features. I had to pop open the manual to find the placement of the case power, reset and LED pins, since there was no indication printed on the board itself.

It only has D-Sub and HDMI ports for onboard graphics, but if you've got an HDMI monitor it seems like a fairly solid budget board.

MSI DVD-RW

In the same build I grabbed an MSI DVD drive as well, again the cheapest on offer at $16.

It came in a box (unusual these days) with a set of screws and a replaceable faceplate for people with white cases -- cute.

I was pleasantly surprised with the noise -- no louder than any other drive. About the only drawback is the white text on the faceplate; Pioneer DVD drives, for example, print it in black so it's not as obvious.

But overall another decent budget option.

Tuesday, November 5, 2013

Review: Shaw GT-GM1 Computer Case

After reading this review on OCAU forums, I was curious as to whether the Shaw cases were really a viable option for budget builds.

So I picked up the $33 Shaw GT-GM1 from MSY to give it a test (no PSU of course -- as all good children say in their prayers at night, please don't let Daddy use a Shaw power supply).

I must admit the sound of parts rattling around inside the box was a bad sign. It turned out that a couple of the 5.25" and 3.5" clips that hold drives in place had fallen out and needed to be clipped back together -- no big deal.

First impressions: the case appearance was far from pretty, but not too garish. The crinkly front doesn't look as bad in real life as it does to me in the photos. Top-mounted power supply position is getting rarer these days, but was common a few years back and seems the way ultra-cheap cases are still made. The back screws felt a little cross-threaded. The side panels were flimsy (as noted in the other review), pretty much as expected in a cheap case.

Notes on the rest of the setup:

  • One of the 3.5" plastic disk holder clips was missing.
  • Two missing expansion slot plates at the back.
  • Motherboard risers were pre-installed. Some risers were loose and couldn't be tightened -- cross-threaded out of the box. Reworking some of the other risers (there were more than would be needed for the mATX board I had) fixed the problem, but it didn't instill any confidence.
  • No case screws! Other than the motherboard risers, nothing at all.
  • Front panel cords were cable-tied in such a way they wouldn't reach the motherboard. Pretty difficult to cable-manage them in any neat way.
  • Case comes with a single fan at the back. The fan had to be powered directly from the PSU via a Molex plug, rather than a 3- or 4-pin plug into a motherboard header.
  • Noise levels aren't great. The case fan runs flat out due to its Molex plug.

I can't remember ever feeling so dubious about a case working properly, but on first boot everything seemed okay. USB, headphones, power/reset switches and fan all worked upon testing, frankly to my surprise at that point.

So my conclusion is a little different to the OCAU reviewer's: given the problems I found with this case, I couldn't recommend it, even taking the price into account. It's quite likely that someone doing a cheap-as-possible build is on their first build, or is at least inexperienced, and the lack of case screws and the dodgy motherboard risers would likely stymie them. I was okay because I had a heap of spare parts on hand, but not everybody will. Anyone actually using the case should also be prepared to either replace the stock fan or fit a voltage regulator (taking the price closer to the next case in line anyway).

In my opinion, you're far better off paying an extra $15 for a fifty-dollar case. Even though they may not be fantastic either, I think it's money well-spent.

Saturday, October 26, 2013

Convert M4V to AVI Using ffmpeg

I tried numerous ffmpeg settings to convert some M4V (MP4) files into AVI. Most either didn't work or crashed with a segmentation fault.

The only one I found that worked was this answer at StackOverflow. However, the quality of the file produced using the settings in the answer wasn't great, so this is what I ended up using:

ffmpeg -i input.m4v -f avi -b 2048k -ab 160k -ar 44100 output.avi

The output files are about twice the size of the input file, unfortunately, but they look decent.
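
Note that newer ffmpeg builds deprecate the bare -b and -ab flags; if the command above is rejected, the per-stream equivalents should be something like:

ffmpeg -i input.m4v -f avi -b:v 2048k -b:a 160k -ar 44100 output.avi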

Sunday, October 20, 2013

AMD GPU on Linux, Revisited

It's been almost a year since I first tried to run an AMD graphics card on Linux. After some success with an AMD A-CPU build a few weeks back, I picked up a 2nd hand HD7770 yesterday to try out and see if things have improved.

There were a number of tests I was interested in doing. I didn't have a convenient dual-monitor setup available, so everything is on a single screen (dual screen was one of my major problems last time, so I might get around to this at some point). The tests were:

  • Dragging a window around, looking for tearing
  • Scrolling in a web browser, looking for tearing again
  • Playing a HD video
  • Playing a game

I installed the card on my old Intel Q6600 system, and ran off the onboard graphics to start with, to get a baseline. I installed a fresh Kubuntu 13.04 as the test platform. These are the results, discussion at the end:

Test 1: Intel i915 (onboard) Graphics
Action          Result  Comments
Drag window     X       Noticeable glitches
Browser scroll  X       Occasional glitches
Movie           :)
Game            X       Unplayable (took a few minutes and hadn't even made it through the intro to the title screen)

Test 2: Open Source (Radeon) Drivers
Action          Result  Comments
Drag window     X       Noticeable glitches
Browser scroll  X       Occasional glitches, similar in appearance to the onboard performance
Movie           :)
Game            X       Game ran, going from 20 FPS sometimes down to 5. Some major glitching, and eventually a black screen. Not playable.

Example of the in-game glitching with the open source drivers


Test 3: AMD Catalyst 13.4
Action          Result  Comments
Drag window     :)      Looks nice, but need to ensure "Tear Free" option is selected in AMD Control Centre
Browser scroll  :)      Again, need to ensure "Tear Free" option is selected
Movie           :)      And again
Game            :D      Very good results (eventually), no noticeable lag on full-screen.

I intended to use Kubuntu 13.10, but the current version of Catalyst wouldn't install properly on it, so I dropped back to 13.04.

Installing the driver is still fraught with difficulties. The first couple of attempts failed -- even using the Ubuntu packages with apt-get install fglrx fglrx-amdcccle didn't work. In the end, the way I got it going was by following the instructions in this askubuntu answer. In particular, I think getting all the dependencies is critical.

The proprietary driver may have an unfair advantage in my drag/scroll tests, since I found the "Tear Free" switch. I'm not sure if such an option exists for the open source drivers.

So, I had better luck than last time round, but the installation process is still daunting, and dual-screen support would be good to try out.

Thursday, October 17, 2013

Fixing KDE Games Menu Crash

After installing the game "A Virus Named Tom" from the Humble Bundle 9, trying to view the "Games" menu in KDE started crashing.

The problem is due to the file avirusnamedtom_com-avirusnamedtom_1.desktop in ~/.local/share/applications. It contains a reference to an image where either a) the reference is invalid because it contains spaces, or b) the image is unreadable.
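
If you're not sure which entry is the culprit, one way to spot Icon paths containing spaces (assuming the offending entries live in ~/.local/share/applications) is something like:

$ grep -l 'Icon=.* ' ~/.local/share/applications/*.desktop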

To fix the problem, I edited the Icon line from:

Icon=/home/ash/Software/games/installed/
  VirusNamedTom/A Virus Named TOM.bmp

to:

Icon="/home/ash/Software/games/installed/
  VirusNamedTom/A Virus Named TOM.bmp"

(Line break above is for clarity -- there is no line break in the file).

After logging out and logging back in, the Games menu should work again (although the icon for AVNT still doesn't appear properly, and I can't open the file in the default image viewer, which makes me think it's broken in some way. GIMP can open it without any problem, so I'm not sure what the issue is).

An alternative solution is to rename or delete the .desktop file, but then you lose the AVNT menu entry.

Also note this should fix a crash when clicking on the menu and selecting "Edit Applications..." -- which also occurred until I made this fix.

Hat tip to abelthorne, who posted the clue that led me to track this down.

Update 17/10/2013:
After a bit more experimentation, I found that the image can be made to work in the Games menu by saving it as a .PNG in GIMP, removing the spaces in the filename, and removing the quotes in the Icon entry in the .desktop file. There seem to be some weirdnesses or limitations in the .desktop format, and also something strange with the .BMP that comes with AVNT.

Thursday, September 26, 2013

MakeMKV + Handbrake

Handbrake is a great transcoding tool, but I've come across situations where it refuses to format shift some DVDs. To get around it, I installed MakeMKV which knows how to do such things.

The process is a little more drawn out though, here's how I did it:

  1. Download the MakeMKV binary and source as per the MakeMKV Linux wiki page.
  2. In a command prompt, execute the following commands (assuming version numbers are the same):
  3. $ tar xvf makemkv-bin-1.8.5.tar.gz
     $ tar xvf makemkv-oss-1.8.5.tar.gz
     $ cd makemkv-oss-1.8.5/
     $ make -f makefile.linux
     $ sudo make -f makefile.linux install
     $ cd ../makemkv-bin-1.8.5/
     $ make -f makefile.linux
     $ sudo make -f makefile.linux install
     $ makemkv

  4. The last command starts the newly installed makemkv GUI. The GUI is fairly easy to work through, but this guide for using it is pretty good. I didn't need to change any of the default settings, just let it do its thing.
  5. Once MakeMKV has ripped its stuff, it will create an MKV file. You can then load this file into Handbrake and transcode into any other format as normal. The MKV can be deleted after transcode if you don't want it anymore (it can be pretty big).
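
If you prefer the command line to the Handbrake GUI, something like this should do the transcode (a sketch; the filenames are illustrative, and it assumes HandBrakeCLI is installed and your build still has the old "Normal" preset):

$ HandBrakeCLI -i ripped.mkv -o ripped.mp4 --preset="Normal"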

Update for MakeMKV 1.8.10 Beta (14 Jun 2014):

The current instructions on the wiki page work fine; I'm not sure why I had a slightly different process above. You also need to register a key to use the beta.

Monday, September 23, 2013

Review: Rapoo RP-X1800 Wireless Keyboard/Mouse

Since my Shintaro wireless keyboard decided to go on the fritz, I needed a cheap replacement while the Shintaro goes off on its warranty journey.

The Rapoo RP-X1800 was $19, and I have to say I'm pretty impressed with it. The keys are a little squishy (as is to be expected with a cheap keyboard), there is no caps lock LED (also common for wireless keyboards), and there is no off switch on the keyboard (there is one on the mouse).

But the wireless connectivity was flawless. Compared to the Shintaro, which was never quite right, the Rapoo just worked straight away with no key presses lost, and moving the mouse will wake the computer from screen saver, something the Shintaro trackball never did, to my annoyance. The wireless USB adapter isn't as small as a Logitech one, but is smaller than Shintaro's.

The biggest drawback for use as a HTPC input device is the separate mouse and keyboard parts, but I'd have no hesitation recommending the Rapoo X1800 as a solid budget wireless KB/mouse combo.


Update 14-Dec-2013:

After running with this combo for a while now, I have noticed occasional glitches when typing, usually when trying to type something really fast. It isn't much of an issue (typing passwords is probably the most annoying part), but I didn't notice it when I first started using the Rapoo, so I thought I'd update to mention it.

For me Logitech is still the best as far as quality of wireless connectivity goes.

Saturday, September 21, 2013

Conky Config

(Update 25-Oct-2014: Slightly updated version posted.)

Just installed conky and looked around for some cute configs. This one by asoliverez looked nice, so I grabbed it and started playing around.

It didn't quite work for all values, since I didn't have all the pre-requisite software installed. So I hacked it up a little bit, and came up with the following that works with sensors (I worked out how to do it using this guy's example as a template), nvidia-smi (since I've got an nVidia GPU installed) and my particular file systems setup:

File: .conkyrc

background no
font Sans:size=8
#xftfont Sans:size=10
use_xft yes
xftalpha 0.9
update_interval 5.0
total_run_times 0
own_window yes
own_window_type normal
own_window_argb_visual true
own_window_transparent yes
#own_window_class conky
own_window_hints undecorated,below,sticky,skip_taskbar,skip_pager
# To make conky always on top, swap 'below' in above line to 'above':
#own_window_hints undecorated,above,sticky,skip_taskbar,skip_pager
double_buffer yes
minimum_size 220 5
maximum_width 220
draw_shades yes
draw_outline no
draw_borders no
draw_graph_borders yes
default_color CDE0E7
default_shade_color black
default_outline_color green
alignment top_right
gap_x 12
gap_y 35
no_buffers yes
uppercase no # set to yes if you want all text to be in uppercase
cpu_avg_samples 2
override_utf8_locale no

TEXT
${color white}SYSTEM ${hr 1}${color}

Hostname: $alignr$nodename
Kernel: $alignr$kernel
Uptime: $alignr$uptime
MB Temperature: ${alignr}${iconv_start UTF-8 ISO_8859-1}${exec sensors|grep 'Physical id 0'|awk '{print $4}'}${iconv_stop}
CPU Temperature 0: ${alignr}${iconv_start UTF-8 ISO_8859-1}${exec sensors|grep 'Core 0'|awk '{print $3}'}${iconv_stop}
CPU Temperature 1: ${alignr}${iconv_start UTF-8 ISO_8859-1}${exec sensors|grep 'Core 1'|awk '{print $3}'}${iconv_stop}
CPU Temperature 2: ${alignr}${iconv_start UTF-8 ISO_8859-1}${exec sensors|grep 'Core 2'|awk '{print $3}'}${iconv_stop}
CPU Temperature 3: ${alignr}${iconv_start UTF-8 ISO_8859-1}${exec sensors|grep 'Core 3'|awk '{print $3}'}${iconv_stop}
Fan 1: ${alignr}${hwmon 1 fan 1} RPM
Fan 2: ${alignr}${hwmon 1 fan 2} RPM
#Battery: ${alignr}${battery_percent BAT0}%
CPU: ${alignr}${freq} MHz
GPU Temp: ${alignr}${exec nvidia-smi | grep '. ..\% ..C'|awk '{print $3}'}${iconv_start UTF-8 ISO_8859-1}°${iconv_stop}C
Processes: ${alignr}$processes ($running_processes running)
Load: ${alignr}$loadavg

CPU1 ${alignr}${cpu cpu1}%
${cpubar cpu1}
CPU2 ${alignr}${cpu cpu2}%
${cpubar cpu2}
CPU3 ${alignr}${cpu cpu3}%
${cpubar cpu3}
CPU4 ${alignr}${cpu cpu4}%
${cpubar cpu4}

Ram ${alignr}$mem / $memmax ($memperc%)
${membar 4}
swap ${alignr}$swap / $swapmax ($swapperc%)
${swapbar 4}

${color gray}Highest CPU $alignr CPU% MEM%${color}
${top name 1}$alignr${top cpu 1}${top mem 1}
${top name 2}$alignr${top cpu 2}${top mem 2}
${top name 3}$alignr${top cpu 3}${top mem 3}

${color gray}Highest MEM $alignr CPU% MEM%${color}
${top_mem name 1}$alignr${top_mem cpu 1}${top_mem mem 1}
${top_mem name 2}$alignr${top_mem cpu 2}${top_mem mem 2}
${top_mem name 3}$alignr${top_mem cpu 3}${top_mem mem 3}

${color white}Filesystem ${hr 1}${color}

Root: ${alignr}${fs_used /} / ${fs_size /}
${fs_bar 4 /}
Files: ${alignr}${fs_used /files} / ${fs_size /files}
${fs_bar 4 /files}

${color white}NETWORK ${hr 1}${color}

Eth0: ${addr eth0}
Down ${downspeed eth0} k/s ${alignr}Up ${upspeed eth0} k/s
${downspeedgraph eth0 25,107} ${alignr}${upspeedgraph eth0 25,107}
Total ${totaldown eth0} ${alignr}Total ${totalup eth0}

Wlan0: ${addr wlan0}
Signal: ${alignr}${wireless_link_qual wlan0}%
Down ${downspeed wlan0} k/s ${alignr}Up ${upspeed wlan0} k/s
${downspeedgraph wlan0 25,107} ${alignr}${upspeedgraph wlan0 25,107}
Total ${totaldown wlan0} ${alignr}Total ${totalup wlan0}

(Text wrap doesn't work real well...sorry. It should cut and paste okay though if anyone is interested in it).

(Update 23-Dec-2013: Added option to keep conky "always on top").

(Update 27-Sep-2014: Fixed CPU bar always showing same % issue).

Looks pretty good sitting there over in the corner of the screen:

Sunday, September 15, 2013

Fixing Tearing with Kubuntu 13.04 and nVidia 304.88

UPDATE 26th May 2014: I tried this same fix on Kubuntu 14.04, and it black-screened the system. I'd also been tinkering with some other stuff at the same time, so I haven't tracked it down exactly (or what the new fix might be), but I suspect it is the GL_YIELD change to /etc/profile below. I don't recommend doing it unless you're ready to roll back/recover the original.
UPDATE 20th Oct 2014: Tried a very similar fix on 14.04 and this time it worked, so I'm not sure what was going on here.

After finding a workaround to fix tearing on Mint 15 Cinnamon, I jumped to Kubuntu 13.04 because there were some strange issues with screen recording in Mint -- I assume because the tearing workaround affected some of the internals in the graphics stack that screen capture uses.

Unfortunately, Kubuntu had the same tearing issue that Xubuntu exhibited. The solution again was a workaround, but different:

  • Install KDE 4.11 as per the instructions at this noobslab article:

    sudo add-apt-repository ppa:kubuntu-ppa/backports
    sudo apt-get update
    sudo apt-get dist-upgrade

    (Not sure if this was actually necessary, but it was one of the things I tried first).
  • In "Desktop Effects -- KDE Control Module" => Advanced tab, set Compositing type to "OpenGL 3.1" and Tearing Prevention (VSync) to "Re-use screen content".
  • Add:

    export __GL_YIELD="USLEEP"

    To /etc/profile as per this KDE forum post. NOTE: But see above regarding this possibly having problems in more recent versions.

That should fix the tearing.

Note on Fullscreen Games

For some reason, the above worked in the general desktop environment, but fullscreen games were still tearing. The fix that worked in that case was to uncheck "Suspend desktop effects for fullscreen windows". Not sure why that made things better, since I would have thought the other way around would work, but that's the change that fixed it.

Things That Didn't Work

For reference, here are the things I tried that didn't work:

  • Turn off compositing with Alt+Shift+F12 (no effect).
  • Install KDE 4.11 (as noted above, this by itself didn't help, but may be necessary for the workaround).
  • Install compiz. The tearing was perhaps a little better, but still not great, and compiz just doesn't look as good as kwin.

And one last resort I would have tried eventually: Re-compiling KDE with this patch.

Running Card Hunter in Linux

I read a recent PAR article on new browser-based game Card Hunter so I signed up to give it a shot.

Unfortunately if you're using Linux and Firefox, depending on the version of the flash player you have installed, it probably won't work.

To get it going I installed Chromium and the Pepper flash plugin, following these instructions (as referenced in the Card Hunter forums). This works perfectly.

Saturday, September 14, 2013

New Linux Install -- Software List

Software and setup for a new distro install:

Setup/Config

  • /etc/fstab SSD tuning
  • Case insensitive bash
  • Copy over .fonts folder
  • Copy over .xsetwacom, .vimrc, .hgrc

Software

  • GPU drivers (if needed)
  • Firefox, plus add-ons:
    • AdBlockPlus
    • NoScript
    • FlashGot
  • Mercurial
  • Thunderbird
  • Gvim
  • KeepassX
  • ia32-libs
  • Handbrake
  • Unetbootin
  • Imagemagick
  • VLC
  • lm-sensors
  • Gimp
  • Krita
  • Eclipse

Optionals/As Needed

  • VirtualBox
  • Wine
  • Wallch
  • SimpleScan
  • k3b, xfburn (or some other DVD burner)

Experimentals

  • Tupi2D
  • recordMyDesktop
  • Vokoscreen

Tuesday, September 10, 2013

Budget Small(ish) Web Browser Box

I wanted a small box to sit in the corner as basically nothing more than a web browser. Since mITX parts were so difficult to source locally, I went with the smallest mATX case I could find.

Here is the parts list:
  • CPU: AMD A4-5300 $55
  • Mobo: Asus F2A55M-LK-Plus $49
  • RAM: 1x4GB 1600 (already had)
  • SSD: Plextor M5S 128GB $89
  • Case: Coolermaster RC361 $49
  • PSU: Corsair VS450 $49
  • KB/Mouse: USB combo (already had)

Total: $291 (would have been about $350 if I didn't have RAM/KB/mouse already).

Some happy snaps:

The motherboard, fresh out of the box. Notice how it captures the light...(because of my crappy photography)

The motherboard was fairly low end, but it was a good price so I grabbed it. It doesn't have a USB 3 header or SATA 3, but since the build didn't need either, it met the requirements.

CPU, pins pins pins

Motherboard with RAM, CPU and HSF mounted

Mounting the AMD HSF was straightforward: stick it on the CPU and clip it into place. Easier than Intel's "four pins" mechanism, which takes a little getting used to.

The empty case

The Coolermaster RC361 is about the smallest micro-ATX case I could source locally. It's a "sideways" mini tower, so can stand upright or on its side. For a case that cost less than $50, it's not too bad.

Case with the power supply (tucked away, unusually, in the front of the case)

Mounting the power supply was fairly tricky, and given its unusual location, there's no way to turn the PSU on or off without pulling the front of the case off (which, unlike some cases, came off easily). I don't think access to the PSU switch will prove too much of a problem for my uses. The case also has no place to mount an SSD. Since I didn't have a mounting kit, I just taped it to the bottom of the case. The RC361 does come with a comprehensive set of extra screws and cable ties.

The completed build. SSD isn't visible because it's taped to the bottom of the case

Installed Kubuntu 13.04 and it worked really well. The AMD A4-5300 (with the proprietary Catalyst drivers installed) will even play some games at a reasonable level (albeit with low settings).

Sunday, August 25, 2013

Installing SmoothDraw on Linux

At the moment, SmoothDraw is my "one application" that means I have to keep my old dusty Windows box in the corner.

I investigated installing SmoothDraw on Linux under Wine, and the good news is that you can get it to work, but there are caveats.

To start with, the latest version of SmoothDraw (version 4) requires .NET Framework 4.0. I couldn't get this running properly under Wine.

But winetricks can be used to install .NET 3, and with this running you can get SmoothDraw3 to work. The only drawback is that tablet input loses its pressure sensitivity (think it's treated as a regular "mouse-like" device by the wine driver).

An option to get pressure sensitivity working was the SAI (1.5.5) version of wine specifically set up for this. But then I couldn't use winetricks, and only winetricks seems to "know" how to install .NET...

Anyway, having it work allows me to open, view and manipulate existing files, so it's worth having. This is the process for installing SmoothDraw3 under wine 1.4.1 (pieced together from my history...forgot to write it up at the time :/)

$ sudo apt-get install wine
$ export WINEARCH=win32
$ winetricks dotnet35
$ wine SmoothDraw3Setup.exe
$ wine .wine/drive_c/Program\ Files/SmoothDraw/SmoothDraw3.exe

Sunday, August 11, 2013

CSS Holy Grail: Sadness

Saw an article the other day on the "holy grail" of CSS layout: the three column equal height liquid layout.

The sad thing is that after all this time, there isn't a simple pure-CSS solution to this problem that doesn't involve extra divs and hacky behind the scenes manipulation. :(

Saturday, August 3, 2013

DDR3 Dual-Channel vs Single-Channel Performance

It was suggested to me that with modern CPUs/motherboards, running with dual-channel memory was unnecessary because there's no difference. My understanding was there is a modest performance increase in using dual-channel. I decided to do a quick test to see.

For the test I used Handbrake (0.9.9) to transcode a 187MB 1080p video. This isn't a true benchmark because the conditions are maybe too uncontrolled (hence it's a "quick test"), but I thought using a real-world application like transcoding would give an idea of the performance difference if any.

I swapped between a single 4GB RAM stick and 2x2GB sticks. It was in my low-powered HTPC, so it took a while. These are the results:

RAM type  Run #  Time taken
1x4GB     1      4m 55s
1x4GB     2      4m 48s
1x4GB     3      4m 49s

2x2GB     1      4m 36s
2x2GB     2      4m 37s
2x2GB     3      4m 35s

It works out to about a 5% performance improvement using dual-channel, which is consistent with other things I've read. How that translates from a processor-intensive task like transcoding to everyday use I'm not sure; I suspect it would be unnoticeable though.

Rest of the system specs, for reference:

  • OS: Xubuntu 12.04.1 (64-bit)
  • CPU: G2020
  • Mobo: Asus P8B75-M LX
  • RAM: G-Skill NT 1x4GB or G-Skill NT 2x2GB
  • SSD: Kingston V300 60GB
  • HDD: Toshiba 1TB 7200RPM
  • GPU: Gigabyte GT610
  • PSU: Antec EarthWatts 380

Wednesday, July 24, 2013

Patent Killer

Very interesting post by Joel Spolsky on how to get a patent killed.

For anyone interested in writing software and not having to play Russian Roulette with the quagmire of bad patents out there, well worth a read.

(For any software developers out there: hands up who has actually read a patent to learn how to do something?)

Monday, June 17, 2013

Fixing nVidia Graphics Tearing on LinuxMint Cinnamon

So, after installing an nVidia GTX 650TI a while back, I put up with tearing in pretty much all applications because it was easy to install and get going.

But eventually I got sick of it and tried to fix it. After trying about every fix I could find, I jumped to a different distro (from Xubuntu 12.10 to Mint 15) in the hope that might have some effect. It didn't, but I re-tried a fix I thought I'd already tried, and found it eliminated the tearing 100%.

The fix is to add this to the /etc/environment file:

CLUTTER_PAINT=disable-clipped-redraws:disable-culling
CLUTTER_VBLANK=True

This is using the nVidia proprietary driver 304.88. Hope to confirm on other Xubuntu systems whether this fix also works.

Update 20 June 2013:

Unfortunately, the fix only seems to work in Mint/Cinnamon. Title updated to reflect this.

Saturday, June 15, 2013

Fixing "mojo.run: No such file or directory" error

When trying to install something with a .mojo.run extension (for example, I was installing Dungeon Defenders from the Humble Bundle), if you get this error:

mojo.run: No such file or directory

The solution (thanks Kazade!) is to install the ia32-libs package. Then the installer should work.

(Posted this because it took me way too long searching to stumble across Kazade's post).

Update 20/10/2013:

I noticed that ia32-libs is no longer an installation target in 13.10. There doesn't seem to be an easy way to fix this, since some suggested mechanisms don't work when you've got a mojo.run file.

The only way I found to do it is detailed here: http://wiki.phoenixviewer.com/ia32-libs-in-ubuntu-13-10

The version of synaptic I was using was slightly different to that described above. In Step 4, it was "click New" rather than "other software -> add", and in step 5 the values to insert go on four separate lines:

  • Dropdown box: Binary (deb)
  • URI: http://archive.ubuntu.com/ubuntu/
  • Distribution: raring
  • Section(s): main restricted universe multiverse
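
For reference, those four fields correspond to a single repository line that looks like this:

deb http://archive.ubuntu.com/ubuntu/ raring main restricted universe multiverse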

Monday, June 10, 2013

Fix broken xfce4 desktop after uninstalling compiz

So, to test a theory on how to fix graphical tearing on a Xubuntu 12.04.1 install, I tried installing compiz.

Not only did it not help, but it also broke the desktop environment. So I uninstalled it (following the reverse of the compiz installation steps). Uninstalling doesn't clean up everything though, and the desktop was still broken.

Fortunately, I found this post which suggested removing ~/.config and ~/.cache.

You don't need to zap those entire directories though. Inside each one there is a subdirectory that starts with "compiz". Those are the directories that need deleting. After that, I rebooted and the regular xfce4 desktop was working again.
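
In other words, something like this should be enough (double-check the glob only matches compiz directories before deleting):

$ rm -rf ~/.config/compiz* ~/.cache/compiz*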

Friday, May 31, 2013

Review: Logitech K400 vs Shintaro Wireless Multimedia Keyboard

I was tossing up between a Logitech K400 and a Shintaro Multimedia (trackball) keyboard for my HTPC. I got the Logitech first for $35, but wasn't entirely happy with it so I got the Shintaro as well ($38+postage) for a comparison. These are their stories...

Shintaro Wireless Media (above) and Logitech K400

My initial reaction was that the Shintaro has a much more solid build than the Logitech. The mouse buttons were clicky and "alive", compared to the gummy feel of the K400 buttons, where you're never quite sure if you've actually pressed it.

The key layout was also much better, and almost for that alone I'd take the Shintaro. The placement of the right shift key and up arrow buttons on the Logitech continually annoyed me. The Shintaro is closer to a "normal" keyboard layout.

Shintaro positives:

  • Solid feel and nice clicky buttons, has a quality about it.
  • The keyboard layout is close to a regular desktop keyboard. Minimal chance of pressing the wrong key when reaching into the shift/enter/arrow key area.
  • I think I prefer the trackball to a touchpad.

Shintaro Wireless Media keyboard

Shintaro negatives:

  • Size of the USB receiver. That thing is enormous. I ended up connecting it via a USB extension cable from the back of the case, because it looked so precarious hanging out the front.
  • Wireless connectivity can perform really badly. Even with a direct line of sight and less than a metre distance, having the keyboard sitting in the wrong place on your knees can mean up to 80% dropped characters. When it was connected it was fine, but I still haven't quite worked out what positions will cause it to go bad. (Even a direct line of sight < 3m sometimes drops the occasional character when typing).
  • The board "goes to sleep" really quickly. Spinning the trackball doesn't wake it up either, you've got to press a key. I'm so used to bumping the mouse to wake up a computer, it takes a bit to get used to.
  • While the build quality is nice, it is quite bulky.
  • I sometimes had trouble getting into the BIOS with this keyboard.
  • It has a "sync" step you have to perform by pressing a button on the receiver. I seemed to lose sync occasionally, but this may have just been the wakeup problem noted above.
  • Takes four AA batteries, and with an estimated 3 month life, probably falls short of the Logitech in that respect.

Logitech positives:

  • Rock solid wireless connection. I was typing this in another room with no line of sight.
  • Tiny USB receiver and no need to sync. Just works.
  • Size-wise, the keyboard is nice and compact.

Logitech K400 wireless keyboard

Logitech negatives:

  • The flimsy build quality and gummy feel of the keys and buttons. It just felt really cheap, and was difficult to know when you'd clicked a button.
  • Keyboard layout was problematic. In particular, the Right Shift key is much smaller than normal, with the Up Arrow taking up the space. This means the Up Arrow is easily pressed when searching for the Shift key. Doing any command-line stuff during installation, this was infuriating.

The lengths of the lists don't reflect which keyboard won; the Logitech K400's negatives and the Shintaro's positives were the big factors in my decision to keep the Shintaro.

Saturday, May 25, 2013

Case Fan Review: Antec TrueQuiet Pro, Aerocool SharkFan and Coolermaster Sickleflow

I started on a vendetta to try and get my computer to run a bit quieter. Although I think the main culprit of noise levels is the stock CPU fan, the case fans in the Corsair 300R were making some noise, so I bought a few other fans to try out.

The fans were:

  • CoolerMaster SickleFlow 120mm ($8)
  • Aerocool SharkFan 120mm Blue Edition ($13)
  • Antec TrueQuiet Pro 120mm ($20)

CoolerMaster SickleFlow (left) and Aerocool SharkFan (right)

I don't have a dB meter -- quantitative testing isn't how I roll -- so I was just going by the ear test.

The CoolerMaster SickleFlow was the cheapest and was pretty loud, louder than the stock case fan.

The Aerocool was initially really loud as well, but then I realised the "extension lead" in the packet was actually a voltage reducer. With this fitted, the noise dropped to about the same level as the stock fan.

The Antec TrueQuiet Pro had a physical switch for adjusting the speed. At full speed it too was louder than the stock fan, but when switched down to low speed it was quieter.

Spending the extra for something like the TrueQuiet Pro is worthwhile if noise levels are important.

Saturday, May 18, 2013

NFS Mount Hangs on Network Between Two Linux Machines

I was trying to set up NFS on my local network to transfer some stuff between two machines. I thought this would be pretty easy, but there seem to be a lot of guides out there that are either out of date or more complicated than they need to be (maybe they include some advanced features, not sure).

The main problem I had was that the mount command would hang when I tried to connect the client to the server. I tried everything I could think of, and in desperation tried reversing the client<-->server direction. At that point, it worked without a hitch. Still don't know exactly what the issue was (some conflict in the setup or configuration of my server machine?), but I was ecstatic at that point it worked at all.

Here are the steps (use ifconfig on each machine to find out their IP address, or use hostnames if you've set up hostnames):

On the nominated "server" machine

  • $ sudo apt-get install nfs-common nfs-kernel-server
  • Edit /etc/exports and add the following line (assuming here that the client IP address is 192.168.1.1 and the directory to be made available is /tmp):
        /tmp	192.168.1.1(rw,sync,no_subtree_check)
    
  • $ sudo exportfs -ra
  • Check that the entry just added to the exports file is okay with: $ sudo exportfs
  • $ sudo service nfs-kernel-server restart

NFS server daemon processes should now be running.

On the nominated "client" machine

Assuming the server IP address is 192.168.1.7 and /files/remote is the directory which we will be mounting to:

  • $ sudo apt-get install nfs-common
  • $ mkdir <local-directory-to-be-mount-point>
  • $ sudo mount -t nfs 192.168.1.7:/tmp /files/remote

An entry to automatically mount can be put in /etc/fstab, but since I will only be using the NFS connection on an ad-hoc basis, I haven't done that at this stage.
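
For reference, if you did want it to auto-mount, the /etc/fstab entry would look something like this (same server address and mount point as above):

192.168.1.7:/tmp   /files/remote   nfs   defaults   0   0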

Saturday, April 27, 2013

Create a Bootable USB on Xubuntu with unetbootin

To create a bootable Linux installation on a flash drive (using Xubuntu):

  • Insert the flash drive. After it mounts, use df to find the file system name (for example, /dev/sdd1).
  • Install unetbootin if you need to:

    $ sudo apt-get install unetbootin

  • Run unetbootin. *** Requires sudo access. Apparently this is a known issue with unetbootin.
  • I selected the "disk image" option because I'd already downloaded the ISO I wanted:
  • Press OK and it will expand the ISO onto the flash drive. *** Warning: This will delete everything on the drive.
  • Insert flash drive in target machine and boot away. You might have to go into the BIOS and select the USB drive as the bootable device, depending on the motherboard brand.

Update 10/6/2013:
If you want to get rid of the pesky ldlinux.sys file on the flash drive (which even sudo can't delete), do this:

$ sudo chattr -i ldlinux.sys

Then sudo rm ldlinux.sys. (Fix taken from here).

Update 19/10/2013:
If you have an Intel motherboard and boot from a USB drive and get the message "Boot error", it could be due to the BIOS settings. This forum post describes how to fix the problem, which worked on one of my old computers with an Intel motherboard.

Monday, April 22, 2013

Format Flash Drive for Big Files on Linux

By default, flash drives are formatted with the FAT32 file system. FAT32 has a file size limit of about 4.3GB.

To get around this, you can format with a file system that supports bigger files. I chose ext4 for this; you could use ext2, ext3, or others.

Who's a big boy today?

Warnings:

  • You probably won't be able to use the flash drive on Windows machines (maybe this is what you want?)
  • Performance of flash drives under different file systems can apparently vary markedly. I didn't have any issues with mine using ext4.

Here are the commands:

$ df

Use df to find out which device is your flash drive, in my case it was /dev/sdd1. (Make sure you get this right, so you don't blat your hard drive or something).

$ umount /dev/sdd1
$ sudo mkfs.ext4 -L "BigFileDrive" /dev/sdd1

After reformatting, the drive mounted with root as the owner, so I did:

$ sudo chown ash /media/ash/BigFileDrive
$ chgrp ash /media/ash/BigFileDrive

And all was well.

Update 11 Sept 2013:

Trying to run this for NTFS (on kubuntu at least) can result in:

The program 'mkfs.ntfs' is currently not installed.
You can install it by typing:
sudo apt-get install ntfs-3g

But it says it's already installed. This is a known bug; a simple workaround is to just run mkntfs rather than mkfs.ntfs.

Sunday, April 21, 2013

Canon MG6250 Scanning on Xubuntu 12.10

The other day a friend of mine challenged me (well, asked me) if I'd got scanning going on the Canon MG6250.

I had never tried it, and some research showed other people were having some issues as well.

Here were the steps I took to get it going:

  • Install xsane (sudo apt-get install xsane)
  • The sane man pages refer to "backendname" a lot. The project's documentation gives the backend name for the 6250 as "pixma"
  • man sane-pixma (there seems to be a man entry for each backend) tells you that network scanners should normally be detected, but if not, add them directly to /etc/sane.d/pixma.conf
  • Edit that file and add a line of the format:
    bjnp://<ip_address>
    IP address can be retrieved from the printer settings dialog, or from the options in the printer itself.
  • After adding an entry for the printer, save pixma.conf
  • If the sane daemon isn't running (some have reported that it is running, but I had to start it manually as per the next two steps):
    • Edit /etc/default/saned and set RUN=yes.
    • Then start the sane service: service saned start
  • Run xsane

Now xsane should discover the scanner, and instead of saying "no devices found" and dying, it should run up (brings up about 4 windows). All the default settings seem to work — just press "Scan".
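
If you'd rather check from the command line before running xsane, the scanimage tool should list the device once the backend is configured (it lives in the sane-utils package, if it isn't already installed):

$ sudo apt-get install sane-utils
$ scanimage -L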

Saturday, April 6, 2013

Logitech C270 Webcam in Linux

TL;DR: Logitech c270 works with Linux; guvcview good.

I bought a Logitech C270 webcam for my Dad, but thought I'd try it out first and see how it works with Linux.

First thing was to install cheese (as per the suggestion at ubuntu forums):

$ sudo apt-get install cheese

(This required a restart).

Then I ran cheese from the command-line -- webcam started up great! First thing I tried was to do a video capture...and got this:

Cannot connect to server socket err = No such file or directory
Cannot connect to server request channel
jack server is not running or cannot be started
(cheese:2518): cheese-WARNING **: Jack server not found
(cheese:2518): cheese-WARNING **: Could not initialize supporting library.
Segmentation fault (core dumped)

So I tried to install jack and jackd, but this had no effect.

Then this bug suggested installing "gstreamer1.0-pulseaudio".
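
That is:

$ sudo apt-get install gstreamer1.0-pulseaudio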

This worked. Had one core-dump after that, but mostly worked. Cheese complained about not being able to create thumbnails for the videos that were recorded, but I wasn't too concerned about that. By default, pictures go into ~/Pictures/Webcam, videos go in ~/Videos/Webcam.

Screen cap taken with cheese running

Videos recorded with cheese at 1280x960 and 960x720 looked awful. I don't know if this is a function of using webm or something else. Dialling down to 640x480 looked much better.

Only problem: the sound wasn't working.

Couldn't find a readily available solution, so I tried guvcview, after seeing it recommended in this askubuntu question.

$ sudo apt-get install guvcview

This looked like a neat little program. Had a lot more options than cheese. But still sound wouldn't work. Went through all the sound devices in the list. The device ID for the webcam (003 on my machine, as lsusb told me) wasn't in the list. Then I had this anti-brain fart where I recalled that some devices don't work so well in USB 3 ports, and I'd plugged the Webcam into the USB 3 on the front of the case.


640x480 screenshot taken with C270 and guvcview

Plugged it into a USB 2 in the back (restarted guvcview), and BAM! New audio device appears (a "USB Audio"). This worked just fine.

I had to give cheese another go, but still no sound. So I'm not sure what was going on there, but it felt like guvcview gave better control over the capture anyway.

(For the record, they are the Natural Confectionery Co. snakes).

Monday, April 1, 2013

Script to initialise Wacom Intuos 5

To round out the setup for my Wacom Intuos 5 tablet, this is the script I run to initialise it for left-handed use with an nVidia graphics card in Xubuntu 12.10:

#!/bin/bash
if [ -x /usr/bin/xsetwacom ]; then
    xsetwacom set "Wacom Intuos5 M Pen stylus" Rotate half
    xsetwacom set "Wacom Intuos5 M Pen eraser" Rotate half
    xsetwacom set "Wacom Intuos5 M Pen cursor" Rotate half
    xsetwacom set "Wacom Intuos5 M Pen pad" Rotate half
    # HEAD-0, HEAD-1 identify screens when using nVidia graphics.
    # Use xrandr output for AMD, Intel, etc.
    xsetwacom set "Wacom Intuos5 M Pen stylus" MapToOutput HEAD-0
fi

Saturday, March 23, 2013

Configure Mouse Speed in Xubuntu

I found the default mouse acceleration to be way too fast (particularly when trying to click on the single-pixel window borders in xfce).

To slow it down, I followed Patrick Mylund's instructions. These are the results specific to the Logitech G400.

$ xinput --list --short

This shows the names/IDs of input devices. In my case, "Logitech Gaming Mouse G400".

Now create a file ~/.xinput-mouse.sh, chmod +x it to make it executable, and edit it to include the following command:

xinput --set-prop "Logitech Gaming Mouse G400" "Device Accel Constant Deceleration" 4
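
For completeness, the whole ~/.xinput-mouse.sh file ends up being just that one command plus a shebang (the deceleration value of 4 is what suited my mouse; adjust to taste):

#!/bin/sh
# Slow the pointer by increasing the constant deceleration
xinput --set-prop "Logitech Gaming Mouse G400" "Device Accel Constant Deceleration" 4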

Add a file xinput-mouse.desktop to ~/.config/autostart with the following contents:

[Desktop Entry]
Encoding=UTF-8
Version=0.9.4
Type=Application
Name=xinput-mouse
Comment=Slow the mouse acceleration
Exec=/home/<username>/.xinput-mouse.sh
OnlyShowIn=XFCE;
StartupNotify=false
Terminal=false
Hidden=false

Sunday, March 10, 2013

Asus GTX 650TI on Linux

So after suffering through the (self-inflicted) pain of trying to run an AMD GPU under Linux, I bought an nVidia-based GTX 650ti to try out next.

Asus GTX 650ti

The 650ti doesn't have the best reputation for value (with performance that is similar if not worse than the much cheaper HD7770), but it seemed to go okay in reviews and is realistically already overkill for anything I'm going to use it for.

In box

I narrowed the choices down to an Asus version versus the MSI Power Edition. While I'd heard good things about the MSI PE cards, in the end it came down to:

  • DVI-D ports (compared to DVI-I).
  • HDMI ports (compared to mini-HDMI. I have plenty of HDMI cables, but no mini ones).
  • Low noise level and temperatures in reviews (though the MSI is similar here anyway).

Ports

Installation

Compared to the troubles I had with the AMD, getting the nVidia card up and running is a breeze. Admittedly, I cheated this time and went straight to the proprietary driver. The open-source driver (nouveau) worked fine straight after installing the card, and I'll probably keep an eye on it (it's apparently made some recent advances in capability). But, I was tired, and just wanted something to work, so I cheated.

Plugged in to motherboard

To see which drivers are installed, you can use:

$ sudo lspci -v

For more (or heaps more) information, you can add -vv or -vvv to the above command. Running with sudo gives you a bit of extra output as well.

Initially you'll see a line something like this in the VGA controller section:

Kernel modules: nouveau, nvidiafb

These are the open-source drivers installed automatically in recent kernels.

To install the proprietary drivers, you can use aptitude to look up the possible targets:

$ aptitude search nvidia

To get going, all you need is to do the following:

$ sudo apt-get install nvidia-current nvidia-settings
$ sudo nvidia-xconfig
$ sudo reboot

The xconfig command above writes a default file to /etc/X11/xorg.conf. If all installs correctly, repeating the lspci command above will now output something like:

Kernel modules: nvidia_current, nouveau, nvidiafb

Results

Everything worked without a hitch. Running nvidia-settings lets you set up dual monitors. After rebooting, for the first time the login screen was actually two separate screens, rather than mirrored.

There was no obvious increase in noise levels with the Asus card, which is nice.

It's definitely far from perfect though. There are some noticeable artifacts when watching HD video, for example, and some minor tearing while running the game Dungeon Defenders. But at the moment I'm going to take "everything's working and it was easy to set up" over "pixel perfect display". Maybe if it annoys me enough I'll investigate some more, but I'm happy to run as is for the time being.

Update 15/6/2013:

Some months later, I've still been unable to resolve the tearing issue. It also occurs when scrolling up or down in web pages as well as in games/video, and it's frustrating beyond belief that such a trivial action causes screen tearing with vendor-provided drivers.

The issue occurs on multiple computers I've used, all with different hardware and distros, so I can only conclude that nVidia's driver is broken.

Update 3/7/2013:

Forgot to link to it, but I ended up fixing the tearing issue. Unfortunately, I found it only worked with Mint/Cinnamon rather than Xubuntu.

Splitting Large MP4 Files

A flash drive formatted with FAT32 has a ~4.3GB maximum file size. I wanted to use a USB drive to watch a movie on my TV, but the file wouldn't fit. To split a large MP4 file into more manageable chunks, I found this forum post on MP4Box. It works great.
$ sudo apt-get install gpac
$ MP4Box -split 3600 <filename>.mp4
The above splits the file into one hour chunks.

Monday, February 11, 2013

Linux SSD Setup

A solid state drive (SSD) is often the cheapest way to improve all-round performance in a computer.

There are many guides for setting up an SSD for each operating system. When I did a recent reinstall of Xubuntu, I looked through a few of the guides and picked what I felt were the most important things. To my mind, these are the two biggest things:

  • Ensure that the drive controller is running in AHCI mode. This is an option in the BIOS. (Note: With Windows, you need to change this mode before installing. I haven't tried changing it with a Linux install, but it will trash a Windows install).
  • Edit /etc/fstab to add "noatime,nodiratime,discard" options for the SSD partitions.
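
As a sketch, an SSD root entry in /etc/fstab with those options added looks something like this (the UUID is a placeholder for whatever identifier your install already uses):

UUID=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx   /   ext4   defaults,noatime,nodiratime,discard   0   1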

Since SSDs have a lifetime measured in "number of writes", much of the tuning advice is aimed at reducing the number of unnecessary writes.

Here's a few other things I did.

WARNING: I based doing this on the principle of "OS/applications on SSD, data/media on HDD", and "reduce unnecessary writes". Whether or not they are good things to do (particularly, mapping /tmp to a different drive while the system was running), I have no idea. It worked for me, but I didn't base this activity on any existing guide.

System Setup

I have one SSD (Samsung 830 128GB) and one HDD (Seagate Barracuda 2TB). During installation, I partitioned the SSD into ~80GB for the "primary" OS and the remainder for "experimental" OS installs.

I allocated the swap space to the HDD, since this could cause a lot of writes if the system ever needs to swap out (probably rare, given the RAM available). I mounted the HDD as "/files".

Mapping Files in User Home Directory

By default /home contains all the user's files. When I was using Mint, it automatically created "Documents", "Downloads", "Pictures", "Videos" etc. in the user home directory. I maintained this with Xubuntu (can't remember if Ubuntu variants do this by default), but replaced the true directories with symbolic links to the equivalent directories in the HDD (in the /files partition).

For example:

    $ rmdir Documents
    $ ln -s /files/ashley/Documents Documents

This means that all these files are stored on the HDD, keeping the SSD free from associated writes.

Mapping Email and Browser Data

The user's home directory also contains email and browser data. So I symbolic linked my thunderbird mailbox to the HDD as well. This Firefox support post explains how to move the Firefox cache to another drive. In hindsight, I probably should have just linked the entire Firefox folder as well.

Remapping /tmp

I remapped /tmp from the SSD to the HDD. This one was a bit of an experiment. It could have killed my system, I suppose, but everything seemed okay, so I'll explain what I did.

The system uses the directory /tmp to store random runtime stuff as needed. It's a special directory in that anyone can write to it. You need to set the "sticky" flag for this. If you do an "ls -l" on it, you'll see something like this:

    drwxrwxrwt  2 user user 4096 Nov 19 20:13 tmp

The "t" character at the end of the first column indicates the sticky bit is set. This is what I did to move my /tmp directory:

    $ sudo mkdir /files/tmp
    $ sudo chmod 777 /files/tmp
    $ sudo chmod +t /files/tmp
    $ sudo rm -rf /tmp
    $ sudo ln -s /files/tmp /tmp

The reason I noticed I needed to set the sticky flag (the "chmod +t") is that without it, filename completion in the terminal stopped working. I imagine a whole heap of other stuff would have broken too.

Also, when I look at my tmp folder now, the "t" flag no longer appears. Not sure why, but everything still seems to work.

So, there it is: cowboy setup for reducing SSD writes.

Tuesday, February 5, 2013

AMD HD7770 GPU in Linux


Update (Oct 2013): For anyone who stumbles across this, I've posted an update where I got the proprietary driver to work acceptably (but only on single screen).


After researching and asking around on forums about GPUs and Linux, I decided to get a Gigabyte AMD HD7770 1GB as an experiment. From what I could gather, AMD support in Linux was pretty bad (or at least more problematic than with nVidia cards), so I went in with low expectations, willing to wear the loss of flogging it off secondhand if it didn't work out.

TL;DR: It wasn't all bad, but nothing "just works" (as was pointed out in forum responses). From my experience, I have to conclude that the AMD Linux drivers are effectively broken at the moment.

Rationale

Despite being warned away from AMD cards, the 7770 is an entry-level gaming card and not overly expensive. It was also overkill for the games I expect I would be trying to play (of which there isn't a huge selection on Linux yet anyway).

Since it wasn't too expensive ($119), I was willing to take the risk and try it out, partly for the chance to experiment, partly because AMD are reckoned to be good value for money in that price range, and partly because AMD at least supports an open source driver (for varying degrees of "supports").

Machine and OS Specs

Hooked into the i5 3470 system I built in December 2012, running LinuxMint 14 (Nadia) Cinnamon 64-bit.

On with what I found.

Card Installation

Quick and easy. Dropped right into the PCI-E slot without the firm coaxing that RAM and SATA cables usually need. I chose the second port because it looked like the USB 3.0 cable might get in the way of the card. After I put it in, I thought it probably wouldn't have been a problem, but I left it where it was.

Plug in the 6-pin power cable and away we go.

Note: If I'd got the slightly less powerful 7570 (which I was considering) I wouldn't have needed the extra power cable, but I got a beefy PSU and it seemed a shame not to try out some of those cables sitting around in the bottom of the case.

With the GPU plugged in, the on-board graphics are disabled. This is (I gather) expected behaviour.

Open Source AMD Video Driver

Single Screen

After plugging in the GPU and booting up, I didn't have to do anything in particular. LinuxMint 14 has drivers installed by default to drive the GPU.

First test I plugged into the DVI port. It started up fine at full resolution.

A command you can use to see how the system has detected your GPU is:

    lspci -nn | grep VGA

Mine displayed "[Radeon HD 7700 Series]"

Dual Screen

Then I added in the second monitor. If you've ever heard someone claim the dual-monitor support in Linux is limited, you can believe them.

Initially, with both plugged in, only the HDMI monitor would work. The other monitor (plugged into the DVI port, which had been working) just displayed its "Current input timing not supported by monitor display" message.

I worked out how to get things going by using xrandr, a command-line tool that configures the displays.

By itself, executing xrandr outputs details about the detected displays. You need to know the "names" the system has given the monitors plugged into each port in order to control their output settings. In my case, the monitor plugged into the DVI port was "DVI-0", and the one plugged into the HDMI port "HDMI-0".
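
For example, the relevant lines of the output look something like this (heavily trimmed, and the exact modes and geometry will differ):

    $ xrandr
    DVI-0 connected 1920x1200+0+0 ...
    HDMI-0 connected 1920x1080+1920+0 ...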

To make my main screen on the left and the second screen on the right, I used the following command:

    xrandr --output DVI-0 --auto --output HDMI-0 --auto --right-of DVI-0

This works, but leaves the taskbar on the right hand side screen (HDMI-0 in this case). To make the system use a particular screen as the "primary", use the following command:

    xrandr --output DVI-0 --preferred --auto --primary

This shifts the taskbar to the DVI-0 display.
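
For what it's worth, the two steps can be combined into a single command:

    xrandr --output DVI-0 --auto --primary --output HDMI-0 --auto --right-of DVI-0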

Saving Dual Screen Setup -- Attempt 1

So, after using these commands to set up the displays as desired, I had to work out how to make the changes permanent. The "normal" way to do this (if that's the right word) is to put the configuration into the /etc/X11/xorg.conf file.

Using this answer, I adapted a minimal xorg.conf file and came up with this:
Section "Monitor"
  Identifier "First monitor"
  Option     "PreferredMode"   "1920x1200"
EndSection

Section "Monitor"
  Identifier "Second monitor"
  Option     "PreferredMode"   "1920x1080"
  Option     "RightOf"          "First monitor"
EndSection

Section "Device"
  Identifier  "Radeon HD 7700 Series"
  Driver      "radeon"
  Option      "DVI-0"   "First monitor"
  Option      "HDMI-0"   "Second monitor"
EndSection

Something went wrong when I ran it, though. For one thing, the HDMI-0 monitor got a new name and became "HDMI-3". So it didn't quite work.
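
In hindsight, I suspect the option names were the problem. The xorg.conf man page binds outputs to Monitor sections using options of the form "Monitor-<outputname>" on the Device section, so it should probably have looked more like this (untested):

    Section "Device"
      Identifier  "Radeon HD 7700 Series"
      Driver      "radeon"
      Option      "Monitor-DVI-0"    "First monitor"
      Option      "Monitor-HDMI-0"   "Second monitor"
    EndSection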

Saving Dual Screen Setup -- Attempt 2

Next I thought I'd try and set things up with xrandr, then go into the Preferences/Display dialog and use the "Keep current configuration" option to save the setup.

This creates a monitors.xml file in ~/.config with the current settings. However, it only applies after you log in; until then, the HDMI screen, if plugged in, always seems to be considered the "primary".

Saving Dual Screen Setup -- Give Up

I tried a couple of other wild and fancy things that I found on various sites to set up dual screens, but in the end just gave up. No matter which way I plugged the monitors in, if the smaller 23" LCD was attached, it was the one that lit up for the splash screen. After login the 24" came to life, but not before.

I couldn't work out how to get both screens going at the system level, rather than the user level, so I gave up.

Open Source Driver Performance

First test was running a fullscreen 1080p video. It played with lots of glitches and jaggies. The CPU went up to 50-55 degrees on all cores -- was it actually decoding in software?

Then I downloaded the Phoronix Test Suite. The suite package itself is reasonably small, but the full set of tests is over 5GB of downloads and can take a few hours to run. I managed to run a few of them, but none of them ran very well. For example, Nexuiz looked like it was running at about 0.1 FPS.
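
For reference, running an individual test goes something like this (assuming the phoronix-test-suite package is installed, and that I've remembered the test profile name correctly):

    $ phoronix-test-suite benchmark pts/nexuiz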

It was clear that, unfortunately, the open source driver was not even close.

Proprietary AMD Video Driver

It seemed I could choose between the Catalyst driver from AMD's site and one of the fglrx drivers listed in the software centre. I still haven't worked out what the difference between these two things is -- I think they're related in some way, but I'm not sure how.

So after looking at askubuntu.com/questions/142627, I did:

    sudo apt-get install fglrx-updates fglrx-amdcccle-updates

Bam! Upon reboot, the video was completely broken. I could see the motherboard post, and then nothing -- not even a prompt. The onboard video also wasn't working (at the time I didn't realise you have to unplug the GPU completely for the motherboard to activate the onboard video).

So, at a loss, I booted up and typed the following steps blind:

  • Reboot
  • Ctrl-Alt-F1 (to open a console, which wasn't visible at all)
  • Enter username
  • Enter password
  • sudo apt-get -y remove --purge fglrx-updates
  • Enter password again (for the sudo)
  • <wait>
  • sudo reboot

Fortunately, this fixed the problem. So I can't really recommend installing the fglrx-updates variant of the driver.

Next I tried installing the Linux Catalyst driver from AMD's website. This was version 12.11beta at the time (see upubuntu.com/2012/08/install-amd-catalyst-128-on-ubuntu.html for the steps I followed).
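
From memory, the install boils down to making the downloaded file executable and running it as root -- something like this (the filename here is illustrative, not the exact one):

    # filename is illustrative -- use whatever AMD's download is actually called
    $ chmod +x amd-catalyst-12.11-beta.run
    $ sudo ./amd-catalyst-12.11-beta.run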

This driver seemed to work fine (it at least didn't black out my screen) but came with the "beta" watermark in the bottom right. To get rid of that, follow the instructions at askubuntu.com/a/216730.

Dual Screen -- Proprietary

Unfortunately, getting dual screens set up with the proprietary driver was, if anything, more difficult than with the open source driver.

I tried the following mechanisms:

  • Steps at unixmen.com/howto-install-ati-driver-in-ubuntu
  • Then superuser.com/questions/395927, but the Virtual 3820 caused tearing when the taskbar animated up
  • Then I used arandr to get the 2nd display going
  • Then I tried the amdcccle (admin) application to apply "Digital Monitor(2) -> Adjustments -> Scaling (0%)". This resulted in another refresh problem on the second monitor, where it would only update while the taskbar on the primary screen was animating during hover.

So I gave up on the dual screen idea in order to get some testing done for the driver. With the dodgy dual screen setup, running the test application fgl-glxgears got 500-600 FPS, but was very choppy.

Proprietary Driver Performance

To revert to a single screen, I got aticonfig to regenerate a default single-screen xorg.conf:

    aticonfig --initial -f

Now running fgl-glxgears got around 2500-3000 FPS. So the same driver, driving a single screen instead of two, improved the performance of this benchmark by roughly 5x, and it no longer looked choppy. But this is a pretty simple little application, so I wanted something with a bit more meat.

So I installed Xonotic and ran it up fullscreen (1920x1200) with the highest settings I could set. Within the game it ran just fine (too fast for my old hands), but my old eyes couldn't discern any difference between the "default" and "high" settings, so I'm not sure if I did it right.

But, after running this (or, it seemed, any "fullscreen" game), the taskbar animation back in the desktop became choppy and caused tearing all over the screen.

Last Ditch

All this playing had taken a couple of weeks by this stage. It was recommended that I run the sgfxi script from smxi.org (a "fix it all" graphics driver script). When I downloaded and ran it, it exited, saying it needed to be run from outside X.

Which was fine, I thought, so I tried to Ctrl-Alt-F1 into a console. But I couldn't see the console at all. I could see the X session, but none of the TTY consoles (1 through 6) were working.

Somewhere along the way I'd borked those. By this time I'd had enough "experimentation".

Blew away Mint. I'm now running Xubuntu with the onboard video. Anyone want to buy a barely used Gigabyte HD7770?

Conclusion

My experience has led me to conclude that both the open source and proprietary drivers for AMD GPUs on Linux are broken. The Intel drivers for the onboard video aren't without their quirks (for example, I find it really hard to configure the login screen to start on a specific monitor at a specific resolution, and setting up dual monitors is still a bit of a chore). But for the most part they work.

With the AMD proprietary drivers I had crashes, no screen at all for a while, and dual screen setup was diabolical. With the open source driver I had tearing and I wasn't quite sure it was actually using the GPU to do anything.

Hopefully I'll get an nVidia card at some point to get another perspective, but I'll be starting with pretty low expectations.