30/10/2015

You spin me right round...

As part of another project, a group of us got early access to the Structure Sensor - a 3D capture device primarily aimed at iOS devices. You simply attach the sensor to an iPad and walk slowly around whatever it is you're scanning. Afterwards you get an OBJ file you can use with whatever modelling software you want.


Armed with this cool piece of kit, how best to show it off? Remembering that we have 3D printers in the hardware lab, we set about making busts of our own heads. The app takes about 3-5 minutes to fully capture an object, and works best for items around 20cm on a side. Much smaller and you don't have enough resolution to see things clearly; much bigger and capture takes dramatically longer. Luckily head size is perfect!


Normally, to print on the department's 3D printers you need a program that interfaces with the printer (like Pronterface), as well as an intermediate program to convert model files to gcode instructions that tell the print head where it should move (such as slic3r). However, after someone hinted that Windows 10 had some level of built-in support for 3D printers, that seemed like a more interesting (if not necessarily better) option. Much to our surprise, it's amazing. You just plug the printer in, open a model, and click print. Some scaling issues notwithstanding, it's so much easier and more effective than the manual approach above, at least for simple tasks.
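For reference, the manual route boils down to two tool invocations: slice, then send. A minimal sketch of the slicing step, wrapped in Python for want of a shell - the file names are illustrative, and a real print would also need printer-specific flags (layer height, filament diameter and so on):

```python
import subprocess

def slic3r_cmd(model_path, gcode_path):
    """Build the slic3r invocation that turns a model file into G-code.

    Flags beyond --output (layer height, nozzle size, etc.) are omitted
    here; a real print needs a printer-specific configuration.
    """
    return ["slic3r", model_path, "--output", gcode_path]

def slice_model(model_path, gcode_path):
    # Run the slicer; the resulting .gcode file then gets sent to the
    # printer with a host program such as Pronterface.
    subprocess.run(slic3r_cmd(model_path, gcode_path), check=True)
```

Windows 10's approach collapses both of those steps into the print dialog.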


The results are pretty cool. The resolution is high enough that you can tell it's a model of me (and there were higher-resolution options available in Windows' print dialog), and the whole process is pretty straightforward. Being able to scan things in makes having the printer a lot more useful, as having to model a replacement bracket or whatever yourself never really works out as well as you think it will.


26/10/2015

Up, Up, and Away!

UWCS is currently collaborating with Warwick Aerospace (formerly Warwick Rockets) to build and launch a high altitude balloon (HAB): a payload of electronics attached to a large helium-filled balloon. Most HAB projects reach about 100,000ft before the thinning atmosphere lets the balloon expand until it bursts, sending the payload plummeting back down to earth.

We actually started working on this last year, but a number of technical setbacks and poor planning meant that we didn't finish it before the end of term three. This post is mainly going to focus on what we've done so far, so that part two can pick up the action from this year.

The main reason that the project (later named the MilePiClub) failed to get off the ground became apparent when we stopped to look at the equipment we had selected. It turns out that pairing an Arduino GPS unit, an Arduino GSM unit and an Arduino radio module with a Raspberry Pi was not the best of ideas (I still don't quite understand how none of us noticed this). Nonetheless we struggled on, but the technical challenge of coercing all of these hardware modules into doing things they were never intended to do made us miss our end-of-term deadline. As a result we've decided to swap to an Arduino controller, which should reduce the amount of hair-tearing considerably.

The payload in its entirety is actually a fairly complex thing. Using GPS to determine its location, the unit then decides whether to use radio or GSM (text messages) to relay this back to base. Footage from the camera is saved to a memory card. Added to this is the question of fault tolerance - what happens if (read: when) the GPS unit loses its satellite lock and stops working? What if the microcontroller reboots halfway through the flight? That last one has actually been known to happen with the RPi...
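The link-selection logic can be sketched roughly as below. The altitude threshold and fix timeout are made-up numbers for illustration, not the ones from our firmware, and the real thing runs on a microcontroller rather than in Python:

```python
GSM_CEILING_M = 2000   # assumed: GSM only works reliably near the ground
FIX_TIMEOUT_S = 120    # assumed: how stale a GPS fix can be before we distrust it

def choose_links(altitude_m, seconds_since_fix):
    """Pick the downlink(s) for the next telemetry packet.

    If the GPS fix is stale we can't trust the altitude, so the fault-
    tolerant option is to shotgun the packet out over both links.
    """
    if seconds_since_fix > FIX_TIMEOUT_S:
        return ("radio", "gsm")
    if altitude_m < GSM_CEILING_M:
        return ("gsm",)
    return ("radio",)
```

Keeping the decision in one small, pure function like this also makes it easy to test on the ground without any of the hardware attached.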

Launching a HAB requires liaising with local air traffic control and filing a flight plan with the CAA - though given how little control you have over the balloon, it's more a statement of intent than a plan. Average flight time is 2-3 hours, after which you can only hope that your payload doesn't land in too tall a tree. A tracking team normally follows in a car, receiving regular updates from the balloon as it travels. Once the altitude is too low to use radio, communications switch to GSM for a more reliable link close to the ground.

Github: https://github.com/warwickrockets/MilePiClub

Gaming on the Go

Last week I decided to have a go at something that's been trialled commercially (and unsuccessfully) a few times - a game streaming service.

The reason that OnLive and others failed came down to the speed of residential internet connections rather than the quality of the service itself. While most homes in the UK probably have access to a connection fast enough to stream 1080p video, the real problem is getting latency under 40ms to the nearest data centre (on ethernet - wifi need not apply). I wanted to have a go at this last year, but sharing an ADSL connection with 5 other students made it impossible. This year, however, we're practically sat on top of a fibre cabinet, with a 40ms ping to the Amazon data centre in Ireland :D

So, how does this actually work? My laptop may be incapable of running games, but it does have enough oomph to decode HD video (thankfully). This means that it's possible to rent a server with some beefy graphics hardware, run the game on that, stream the output to my laptop, and send my key presses and mouse inputs back. An hour on one of Amazon's g2.2xlarge instances will set you back a whole 45p and nets you 8 CPUs and 15GB of memory as well as a decent GPU.

After spinning up a new node with Windows Server, the first step is to sort out the hardware. By default Windows insists on using the basic display driver for graphical output, leaving the supplied GRID GPU free for whatever you please. Unfortunately, we want the GRID doing the graphical output, and Windows insists so hard that disabling and uninstalling the default driver isn't enough on its own to convince it that you don't want it anywhere near your server. Once the GRID is finally driving the display, you need to start the Windows audio service and set up a virtual sound card so you can get some audio output.

The first thing most people think of when you mention streaming a desktop is RDP or VNC. As we found out last term when we used VNC to cast the Artemis captain's screen to a projector, they are not designed for performance, and you won't get anywhere near 60 frames per second - more likely you'll be measuring in seconds per frame. The best tool for the job is actually Steam's 'in-home streaming', because the link from laptop to data centre is not that far off a home network. You do need to VPN the two hosts together to trick Steam into thinking that you are in fact in the same room as the server. Getting Steam to consistently recognise the link between the two machines was the biggest problem, and stems from its inability to deal correctly with more than one network adapter: discovery packets are often sent out on the default adapter rather than the VPN one, resulting in frustration and sadness. People have reported similar issues when using hosts with VirtualBox adapters.
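The multi-adapter problem can be illustrated in a few lines of Python: binding the sending socket to a specific adapter's address, rather than 0.0.0.0, is what forces a datagram out of that interface. The port matches Steam's discovery port, but the payload is a placeholder, not a real Steam discovery packet:

```python
import socket

STEAM_DISCOVERY_PORT = 27036  # UDP port used by Steam's in-home streaming discovery

def send_discovery(src_ip, dest_ip, payload=b"placeholder"):
    """Send a UDP datagram out of a chosen adapter by binding to its IP.

    Binding to the VPN adapter's address pins the source interface, so the
    OS can't route the packet out of the default adapter instead.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.bind((src_ip, 0))  # pin the source interface
        return sock.sendto(payload, (dest_ip, STEAM_DISCOVERY_PORT))
```

Steam does no such pinning, which is why its discovery traffic so often leaves via the wrong adapter on multi-homed hosts.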

But even with all of that, it actually works surprisingly well - well enough that I often forgot I was streaming. You won't get miracles though: 1080p is probably about the highest you'd want to go to avoid hitting performance and bandwidth limits. The hardware you get is a GRID K520, which is two GK104 chips glued together (each a slightly underclocked GTX 680). This is somewhere between the current GTX 970 and GTX 980 in terms of pixel-pushing power. So, achievable? Yes. Practical? Still not quite there yet - having a good enough connection available is still quite limiting.


20/10/2015

Mmm, Pizza

Every Friday we hold gaming nights in the labs at uni, and invariably pizza gets ordered at some point. This is great, but the traditional way of group purchasing by huddling round someone's laptop doesn't work when there are 50 of you.

Enter pizza-get: a lightweight web app to make bulk-ordering pizza that little bit easier (projects whose problem domain includes knowledge of the Domino's menu are the best projects). It's been interesting to develop because weekly gaming sessions make seven-day mini-sprints the natural cadence. While that doesn't leave much time to get things planned, developed, and tested, it has made prioritizing which features to implement next so much easier.

But for me, the really interesting part of the project is that it processes card payments. Coming from the 'what would be really cool' list of ideas (as opposed to the 'what I think I can implement' one), it's turned out to be a success. Offloading the actual nitty-gritty of the transactions to Stripe saves the hassle of having to protect sensitive information and verify transactions, and means that what seemed like the scariest part of development was by far the easiest. Outsourcing does introduce a fee, but it's worth it if we don't need to source exact change from 50 people.
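Server-side, a charge boils down to posting a token from the checkout form back to Stripe along with an amount in the smallest currency unit. A sketch of assembling those parameters - in Python rather than the PHP the app actually uses, and with a helper name of my own invention:

```python
def build_charge(amount_pence, token, description):
    """Assemble the parameters for a Stripe charge.

    Amounts are in the smallest currency unit, so a £7.99 pizza is 799
    pence. The card details never touch our server: the browser swaps
    them for a one-use token via Stripe's JS, and that token is all we
    ever see or store.
    """
    return {
        "amount": amount_pence,
        "currency": "gbp",
        "source": token,  # one-use token from the checkout form
        "description": description,
    }

# The actual request would then be along the lines of
# stripe.Charge.create(**build_charge(799, token, "1x Pepperoni Passion"))
```

That token exchange is what makes the scary part easy: there's no sensitive card data on our side to protect.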

I did consider writing the back end in Java originally, but after deciding to keep things as light as possible settled on a PHP back end with Bootstrap for the UI (themed with Flat UI). I'll probably wind up active development after this Friday, which would add up to three weeks of programming. Pretty short-lived, but once it does what it needs to, it seems better to move on and try something new.

GitHub: https://github.com/mcnutty26/pizza-get

Live: https://pizza.uwcs.co.uk/