Light Raider

Half a year after HackManchester I decided to give a data-based hackathon a go... this time joining a team of friends. The result was Light Raider, an Android app that encourages Mancunians to run by turning street lamps into targets. It went really well: we won the "quality of life" topic... and I used that money to get myself an Oculus Rift DK2 (but that is a different story). We were even showcased by some of the local media.

The hackathon was organised by the Greater Manchester Data Synchronisation Programme (GMDS). They offered a very wide variety of open data from Greater Manchester's councils, and it was the competitors' role to create new and interesting applications and services that communicate with those datasets to build a more intelligent city.

Of all the datasets we could choose from, we loved the ones storing the information of every street lamp in a council... and, as our topic: improving Mancunians' quality of life. After much hard thinking and many discussions, we decided to take one of my original ideas and remix it a bit: Light Raider.

Light Raider is an Android game that borrows concepts from Ingress, RunKeeper and Tamagotchi to make users go out and run. The user has a pet (a light bulb) that constantly demands to be fed with energy, so every day or two the user must go out and retrieve some energy to keep it alive... and here is where the game gets interesting. Once the player decides to go out running, the view changes to a countdown and the user must retrieve energy faster than it drains in order to refill the light bulb's "batteries". To do so, all the street lamps around the phone are retrieved from the council's dataset and, every time the user passes by a new lamp, the energy level goes up a little bit.
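The core loop boils down to something like this (an illustrative C++ sketch with invented names and numbers; the real game is an Android app):

    // Energy drains continuously while running, and only street lamps the
    // player has never visited before refill it. Names, drain rate and
    // energy amounts are made up for illustration.
    #include <cstdio>
    #include <unordered_set>

    struct RunSession {
      double energy = 50.0;                  // 0..100
      const double drainPerSecond = 1.5;     // forces the player to keep moving
      const double energyPerNewLamp = 10.0;
      std::unordered_set<int> visitedLamps;  // lamp IDs from the council dataset

      void tick(double dt) { energy -= drainPerSecond * dt; }

      void onLampNearby(int lampId) {
        // Revisited lamps give nothing: the player must conquer new ones.
        if (visitedLamps.insert(lampId).second)
          energy += energyPerNewLamp;
      }

      bool petAlive() const { return energy > 0.0; }
    };

    int main() {
      RunSession run;
      run.tick(5.0);         // five seconds of running...
      run.onLampNearby(42);  // ...past a lamp never raided before
      run.onLampNearby(42);  // the same lamp again: no extra energy
      std::printf("energy: %.1f, pet alive: %d\n", run.energy, run.petAlive());
    }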

Sounds easy? It is not. In running mode the light bulb's battery drains quite fast, requiring the user to actually run. The lamps the user visits are also remembered... which means they have to conquer new ones every time.

Still easy? OK! It is not only about visiting new lamps. On the map the user can see which lamps have already been raided by other players, and they will need to reconquer their territory if they want to score high in the overall competition. If the pet dies (after not getting energy for a few days), all the attached lamps are freed and the overall score goes back to 0.

With this combination of micro (pet energy) and macro (lamp raiding) incentives, the user should hopefully have enough motivation to go out and do a bit of exercise!




The Wireless VR Experience

After leaving my VR Gun project aside for a while, I decided to go to HackManchester 2013 and give it the push it deserved by creating not just the gun, but a full VR experience. In 25 hours I managed to finish the weapon and modify an existing game, Angry Bots, to be playable with all the freedom of a wireless system!

I won the "Best Company Project" award from the jury, and was also the runner-up (2nd) in the "Best of All" competition. A true honour that motivated me to polish and showcase the project at the Manchester Android Meetup months later, with huge success :-)

OK, so what is this exactly? Basically, it is a set of applications that detect the user's movements to control and display an FPS version of the aforementioned game. The user experiences a 3D environment that allows them not only to look around with stereoscopic vision, but to walk, jump, crouch, aim... a full VR experience; and most importantly: in real time, without any cables and ultra-light, perfect for feeling deeply immersed.

All the code can be found on my GitHub. I will explain the key parts of the project here; they are summarised in the presentation I used for the Android Meetup, which can be found here.

The Gun:

The gun used is obviously my VR Gun. I modified the original Makey Makey code on the Arduino board so I could track the on/off transitions (reporting only on the first change) for the trigger and the ammo clip.
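As a taste of what that looks like, here is a minimal sketch of the idea, not the actual firmware (pin numbers and message strings are invented):

    // Report trigger/clip transitions exactly once over serial, so the phone
    // is not flooded with repeated "still pressed" messages. Pin numbers and
    // the message format are made up for illustration.
    const int TRIGGER_PIN = 2;
    const int CLIP_PIN = 3;

    bool lastTrigger = false;
    bool lastClip = false;

    void setup() {
      pinMode(TRIGGER_PIN, INPUT_PULLUP);  // a closed tape circuit pulls the pin LOW
      pinMode(CLIP_PIN, INPUT_PULLUP);
      Serial.begin(9600);
    }

    void loop() {
      bool trigger = digitalRead(TRIGGER_PIN) == LOW;
      bool clip = digitalRead(CLIP_PIN) == LOW;

      if (trigger != lastTrigger) {
        Serial.println(trigger ? "TRIGGER_DOWN" : "TRIGGER_UP");
        lastTrigger = trigger;
      }
      if (clip != lastClip) {
        Serial.println(clip ? "CLIP_IN" : "CLIP_OUT");
        lastClip = clip;
      }
      delay(10);  // crude debounce
    }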

Through an OTG cable, the board is connected to a Galaxy S3 attached to the gun itself, running a very particular app. The app, inside the DataStreamer folder, listens to the Arduino output and also tracks the gun's pose. The phone then has all the important information related to the weapon and can send it to the server (the game). But not only that! Because the phone is in contact with the gun and knows when the trigger is pressed, I also implemented some haptic feedback: when the user fires, the gun vibrates with a nice machine-gun rhythm.



Choosing the right phone is not as easy as it seems:

  • It needs not only OTG support but also the ability to supply 5V through it.
  • The pose detection relies heavily on the gyroscope and, nowadays, it is quite difficult to find information about how good a phone's gyroscope is. I did my best to correct any drift using a version of this code, which brings the accelerometer and compass into the mix to create a rock-solid pose reading, but it can still be problematic on medium/low-end phones (the sketch below shows the core idea). For that reason I included a huge Sync button in the middle of the screen, impossible to miss while playing, that realigns the head, hip and gun poses.
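As a rough illustration of the fusion idea, here is a minimal one-axis complementary filter with made-up constants (not the actual code linked above, which fuses full 3D orientation):

    // One-axis complementary filter: the gyroscope is fast but drifts,
    // the accelerometer/compass reading is noisy but drift-free.
    // The constants and single-axis simplification are illustrative only.
    #include <cstdio>

    struct DriftCorrector {
      double angle = 0.0;          // fused orientation estimate (radians)
      const double alpha = 0.98;   // trust the gyro short-term, the reference long-term

      void update(double gyroRate, double referenceAngle, double dt) {
        // Integrate the gyro, then gently pull towards the absolute reference.
        angle = alpha * (angle + gyroRate * dt) + (1.0 - alpha) * referenceAngle;
      }
    };

    int main() {
      DriftCorrector yaw;
      // Fake samples: a biased gyro (pure drift) against a stable compass reading.
      for (int i = 0; i < 100; ++i)
        yaw.update(/*gyroRate=*/0.01, /*referenceAngle=*/0.0, /*dt=*/0.02);
      std::printf("fused yaw after 2s of drift: %f rad\n", yaw.angle);
    }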


The Galaxy S3 works wonderfully, but there are still some scenarios where the user will have to hit the button every 5 minutes or so... at least until I code a proper solution (I have already found one, but it has not been implemented yet; more in the last section). Also, the 5V drawn through the OTG makes the battery drain quite quickly (1-2 hours).


The Movement:


For the movement I used a different phone running a pedometer I originally created for my old augmented reality system, also bundled in the DataStreamer app. The important thing about this pedometer is that it listens not only to the strength of the steps but also to their rhythm, so it is very resistant to noise; it could even run directly on the helmet! Instead, I decided the user would put it in their back pocket: this way it can track the hip orientation and even the inclination, so the user walks towards where their hips point rather than where their head points, and can even crouch in real time or lie down on the ground.
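For illustration, the core of a rhythm-aware step detector looks something like this (made-up thresholds; not the actual DataStreamer code):

    // A step only counts when an acceleration peak is strong enough AND its
    // spacing from the previous peak looks like a plausible walking/running
    // cadence, which filters out one-off jolts. All numbers are illustrative.
    #include <cmath>
    #include <cstdio>

    struct Pedometer {
      double lastPeakTime = -1.0;
      int steps = 0;
      const double threshold = 12.0;    // m/s^2, above resting gravity
      const double minInterval = 0.25;  // seconds (fast run)
      const double maxInterval = 1.2;   // seconds (slow walk)

      void onSample(double ax, double ay, double az, double t) {
        double magnitude = std::sqrt(ax * ax + ay * ay + az * az);
        if (magnitude < threshold) return;
        double gap = t - lastPeakTime;
        if (lastPeakTime < 0.0 || (gap >= minInterval && gap <= maxInterval)) {
          ++steps;                      // the rhythm is plausible: count it
          std::printf("step %d at %.2fs\n", steps, t);
        }
        lastPeakTime = t;               // rejected peaks still reset the rhythm
      }
    };

    int main() {
      Pedometer p;
      p.onSample(0.0, 0.0, 13.0, 0.50);  // strong peak: first step
      p.onSample(0.0, 0.0, 14.0, 0.55);  // too soon after: noise, ignored
      p.onSample(0.0, 0.0, 13.5, 1.10);  // plausible cadence: second step
    }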

Because, as mentioned above, not all phones have a gyroscope fit for real time, I added a toggle button that disables the hip pose detection and uses the head tracking instead, in case the phone is not powerful enough to keep the pose updated without breaking the immersion.


The Helmet:


This is the most important part. During the hackathon I created a helmet out of foam (later replaced by a more professional-looking black helmet) that holds a 7" tablet (a Nexus 7) and two lenses in front of the user's face. The code running here is in the VRUnity folder and contains a Unity3D project: the Angry Bots demo game, modified in two ways:

  1. The player has been removed and the camera replaced, after importing the Oculus Rift SDK, with a stereoscopic pair of cameras that render the view with the right oval distortion for the lenses and also track the head's pose very quickly. Since the OR SDK is PC-only, I had to modify the code a bit so it would not silently crash on my device. Specifically, I commented out all the code related to DeviceControllerManager and DeviceImposter.
  2. I included a communication system that allows the DataStreamer apps to send data to the game. More on that below.


The biggest task here was not only finding out what was not working with the OR SDK (it was not crashing; it was actually running, but my communication system was not... and the SDK was the culprit), but also creating the helmet. There are a few small details to take into account:


  • It has to be closed and dark so outside light does not distract the eyes.
  • The eye-lens-screen distance is very important and varies from user to user. I ended up building a couple of rails into my last helmet to make it adjustable.
  • Breathing is an issue. There has to be an opening for the nose or the screen/lenses will steam up in a few seconds.
  • It has to be light, but still avoid any kind of wobbling.

The first version was not the best, but after many attempts, super glue, cutting, and broken elastic bands, I built a black helmet following all those guidelines.


The Communication:


Everything works together thanks to some UDP magic. The DataStreamer apps bundle the information into datagram packets and send them to the server (the game). Once the game receives a packet, it parses it and redirects it to the relevant GameObjects, which apply the information.

The key to staying real-time here was to use a separate port for each possible message type (fire mode, hip pose, steps, gun pose, etc.): a file defines a set of port offsets, and each stream applies its offset when sending the datagram. On the server side there is one thread running per port; each of these threads listens for exactly one type of message and processes it as soon as it arrives.
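To make the idea concrete, here is a minimal sketch of the sender side in plain POSIX sockets (the real sender is the Android DataStreamer and the receiver is the Unity game; the base port, offsets, payload format and IP below are all invented):

    // One UDP port per message type: base + offset picks the stream, so each
    // receiver thread blocks on recvfrom() for exactly one kind of message
    // and never has to demultiplex. All constants here are illustrative.
    #include <arpa/inet.h>
    #include <sys/socket.h>
    #include <unistd.h>
    #include <string>

    const int BASE_PORT = 5000;
    enum MessageOffset { GUN_POSE = 0, HIP_POSE = 1, STEPS = 2, FIRE = 3 };

    void sendMessage(MessageOffset type, const std::string& payload,
                     const char* serverIp) {
      int sock = socket(AF_INET, SOCK_DGRAM, 0);
      sockaddr_in addr{};
      addr.sin_family = AF_INET;
      addr.sin_port = htons(BASE_PORT + type);  // the offset picks the stream
      inet_pton(AF_INET, serverIp, &addr.sin_addr);
      sendto(sock, payload.data(), payload.size(), 0,
             reinterpret_cast<sockaddr*>(&addr), sizeof(addr));
      close(sock);
    }

    int main() {
      // e.g. the gun phone streams its orientation; on the game side one
      // thread per port processes each message as soon as it arrives.
      sendMessage(GUN_POSE, "0.12,0.98,0.05,0.01", "192.168.0.10");
    }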

What's next:

There are a few bits I am still not happy with, and I will try to solve them at some point.

  • I would love to take advantage of current 3D printers and get a more professional helmet made. I am already talking to a few people and this might happen soon! If it does, I will post the model here.
  • Gun drift. As I mentioned, after a few minutes the gun pose may have drifted a bit. If the tablet on the head had a camera, I could put a few LEDs on the gun's nose, flickering with different patterns, and track the gun directly from the game. This would even allow moving the gun in full 3D space (while in view). I still want the user to be able to fire backwards; in that case the plain gyroscope reading can be used, and when the gun comes back into the front view any drift can be corrected again.
  • Jump! I have already done a few experiments, and it would be interesting to detect jumps using the pedometer code. So far the results make me feel very dizzy after a couple of tries.
  • Use a more professional gun, maybe modifying an electronic airsoft gun so the weight and controls feel more real.

The Virtual Reality Gun Controller

A lot of things have happened since the last update, the most important one being that I managed to get a job in the augmented reality industry! This has kept me very busy for the last year, but I have learnt a lot about the field, and also about Android, iOS, Unity3D and many... many more things. I love it!
Now that my learning curve has eased off, I finally have time to carry on with my personal projects/hacks. And this time I promise I will post some proper code and tutorials (by the way, I have enabled comments, so feel free to comment on the code in the older entries).

So this first new entry is related to the fact that I just received an Oculus Rift. This HMD is poised to push VR back into the spotlight and I wanted to try it first-hand... It promises real-time head tracking and perfect 3D vision of the environment with an epic field of view. You can move around the scenes and after 3 minutes you truly believe you are inside. But head tracking is not everything in virtual reality and there are still some rough edges: one of them is solving the problem of "moving around" with some kind of localisation, which I will talk about at some point; but the interesting one today is interacting with the environment, and when I say interacting I mean shooting at things... time for some virtual reality FPS.

I want to create a gun that I can use in a shooter game, akin to the one Nathan created for Half-Life 2 here. But that gun has some drawbacks (economic ones, because he is buying a custom weapon and a custom tracker) and I want something cheap and nice and custom and hacky and... and... and I want it now. So this and the next posts will talk about creating your very own gun controller, with tracking and everything, for playing in virtual worlds!


Chapter 1: Ripping apart a Nerf Gun and giving it an Arduino brain.


Let's start from the beginning. In this first chapter I will talk about how I opened a plastic gun and closed an electronic one. I will cover how I wired the plastic parts to the Arduino inputs, and by the end the result will be a gun able to fire, reload and select alternative fire. Lights/vibration and tracking will come in the next chapters.

Materials:


The gun: I used a Nerf gun; the "Alpha Trooper" model has a lot of space inside to manoeuvre and also looks badass. I actually bought it to shoot foam darts at people at the office... and yes, we painted them and everything.

The brain: I have a Makey Makey, which is basically an Arduino Leonardo. I bought it just for fun, until this project came to mind! This is not meant to be a Makey Makey/Arduino tutorial, so head to the official forums in case of doubt.

The guts: some cables, some springs, a soldering iron, a saw, a screwdriver and aluminium tape!



Step 1: Unscrew everything


With a Phillips screwdriver and patience I simply removed all the screws. Once done, I very carefully removed and kept all the moving pieces of the gun, as you can see in the photo.
Dismounted Alpha Trooper
I numbered the pieces and will use this photo as a reference over and over again. Keeping all the springs, I disposed of pieces 13, 14 and 15.


Step 2: Making space for the board


As you can see in the top-right corner of the previous photo, the Makey Makey is not very big, but I still had to make some room. The chosen place is right in the centre, on top of the magazine, but that position gives a couple of problems.

"resized" clip
When the ammunition clip is in, it takes up almost all the space... so I cut it with a saw! Always preserving the notch on the side that keeps it hooked to the gun.

This picture compares the resized clip to the board. I also removed all the inner parts, such as the springs and the platform you can see in the previous photo.

"resized" piece1
The other big problem is piece 1. This piece moves along the side rail and, when in "firing" mode, takes up all the space in the centre in order to load the dart. When cutting it, it is important to note that I still needed the two hangers for the rails on each side of the gun, and I still wanted to be able to move piece 4 for realism. So the outcome is a cut like this.

A 1 cm cut to serve as a slot for the board...

Fine! Now I should be able to fit the Makey Makey inside, but I still want to be able to remove it occasionally, and it must float inside the gun so it does not interfere with piece 1's moving rails. I decided to cut a small slot with a saw where the end of the clip rests. The next photo highlights this.
Cutting piece 11 in half
Last, piece 11, which serves as the chamber for the foam dart, is also in the way. Show no mercy removing its right side, always keeping the two small legs that attach it to the body.

Now that the gap is big enough, the Makey Makey should fit inside without moving at all. It is important to note that piece 1 must go in first; make sure its movement is not blocked by the board.
The Arduino board inside the Gun with all pieces


Step 3: Making the triggers talk


The real fun starts now. With some aluminium tape and cables, the trigger (piece 3), the pump system (piece 2) and the magazine should inform the Makey Makey of their state.

To do so I followed this principle: moving parts should not have any cables, but should be used to close circuits "printed" on the fixed pieces. How does this apply? It is easier than it sounds: simply put aluminium tape on the moving parts so that, when in the "on" position, it touches two other aluminium strips connected to the board. Let's see some close-ups.

Pump system circuit
This is the edge of piece 2. Some aluminium tape on the orange part, which moves as the very nice orange arrow indicates, will connect the red and green areas. It is important that the tape is flat enough not to add too much friction to the moving part. Note how, instead of attaching the cable directly, I leave a big, separated area in the "tape circuit". These cables will later go to ground and to an input pin on the board.

Trigger system circuit
The trigger system works on the same principle but looks slightly different. On top of the trigger piece there are two little plastic steps that stop the moving part from shifting up and down, with the happy outcome that the second step is only touched by the trigger when pressed! Some tape as in the photo should do it, always remembering to put some tape on the top of the trigger (visible in the next photos) and to connect each cable to ground and an input pin on the board.

Reload system circuit
And finally, the ammo clip. This one is a bit trickier: after struggling with how to close a circuit with a piece that always feels so fragile (it moves a lot inside its housing), I had a great idea while drying my clothes: clothes-peg springs.

Two hard metal springs glued strategically to the side of the gun, partially blocking the clip entrance while still holding it, will do the job. As always, put some aluminium tape on the clip so it connects both springs, which have cables going to the board.




Flap to remove on the left side


The final bits involve guiding the cables from the tape areas to the board. The Alpha Trooper has enough room behind the trigger to do so, but on the other side of the gun a little flap must be removed.

Removing it leaves some space to guide all the cables through when closing the gun.

And the very final thing is the USB cable. This step is a bit special: eventually this mod will be wireless with the help of an Android phone, but for now a USB connection to the PC is needed. The default cable that comes with the board is a normal mini-USB that won't fit properly between the board connector and piece 6, so it is important to get a mini-USB cable with the connector rotated 90 degrees. These are not easy to find in shops, as they usually come bundled with certain cameras. Luckily for me, a friend gave me one.

Path for the USB cable

The second part is guiding it out of the gun: the cable goes the same way behind the trigger but needs to come out at the bottom. Using the soldering iron or the saw to make some holes in the plastic, the USB cable should go through piece 12, then under the trigger spring, under the trigger connection, and finally over the flap removed in the previous step.

In the photo, red marks the places where I made a hole, blue gives an x-ray view of the cable, and green highlights where I removed the flap (on the other side). Please note that here I am not using the 90-degree cable but the normal one.

Step 4: Test and close


It is time to seal this mess and give it a try. All that is needed is to connect the USB cable to the PC and check that the LED on the board goes green if, and only if, one of the circuits is closed. Pro tip: replace the current springs in the gun with the ones removed from the discarded pieces, to get stronger springs in the modified parts.
It is alive!
Now it is time to put all the pieces back in and replace the screws.
Final look opened
Final Look Closed

Next Step: Programming and LEDs


The next step on my agenda is some custom programming of the current Makey Makey script so that the keys for the clip and pump fire just once, when the clip is removed or the lever is pressed. I also want to add an AND gate to the trigger so it only shoots if there is a clip in position.
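Roughly, the plan looks like this (an illustrative sketch on top of the standard Arduino Keyboard library; the pins and key bindings are invented, not the final script):

    // Planned logic: each event fires its key exactly once per transition,
    // and the trigger is ANDed with the clip sensor so you cannot shoot
    // with the clip out. Pins and key choices are invented for illustration.
    #include <Keyboard.h>

    const int TRIGGER_PIN = 2;
    const int PUMP_PIN = 3;
    const int CLIP_PIN = 4;

    bool lastTrigger = false, lastPump = false, lastClip = true;

    void tap(char key) {               // one clean key event: press + release
      Keyboard.press(key);
      Keyboard.release(key);
    }

    void setup() {
      pinMode(TRIGGER_PIN, INPUT_PULLUP);
      pinMode(PUMP_PIN, INPUT_PULLUP);
      pinMode(CLIP_PIN, INPUT_PULLUP);
      Keyboard.begin();
    }

    void loop() {
      bool trigger = digitalRead(TRIGGER_PIN) == LOW;
      bool pump = digitalRead(PUMP_PIN) == LOW;
      bool clip = digitalRead(CLIP_PIN) == LOW;

      // The AND gate: firing requires both trigger pressed and clip present.
      if (trigger && !lastTrigger && clip) tap(' ');  // fire
      if (pump && !lastPump) tap('r');                // pump / alternative fire
      if (clip != lastClip) tap('c');                 // clip removed or inserted

      lastTrigger = trigger; lastPump = pump; lastClip = clip;
      delay(10);                                      // crude debounce
    }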
I would also love to add some fun LED arrays behind piece 7 and animate them during actions, so stay tuned.

After that, another post should follow with some info on how to make all of this wireless and do some tracking using the gyroscope of an Android phone.

Applications for the AR system



One of the advantages of the AR system I developed is the ability to plug in the logic of a video game. While I was coding the system, my friend Carlos Torija was designing a video game: he created the artificial intelligence and logic, and then we both added simple graphics with OpenGL and gave it some AR. In this game you have to evade/attack an evil drone that follows you and tries to kill you. The game is designed to be played in an open space, and it has virtual walls! The next step is map recognition.




I also started another AR game using my system. I planned to release it for bada 2.0, but Samsung keeps delaying the release, so the game is unfinished. It is an augmented stunt-kite simulator; at the moment it has really simple physics and fixed wind, but I plan to add a wind system based on weather forecasts, plus more advanced physics, in order to perform realistic tricks.


Note: the pink artifacts appear when taking a screenshot and are not present in the real application.

NDS experiments

I know the Nintendo DS is old hardware now, but back in my day it was awesome! One day I discovered PAlib for the NDS and decided to investigate. I came up with some weird ideas, from a back-scratching game to a portable version of the "Settlers of Catan" board game (one now exists for the NDS, but it is not my unfinished experimental version). I also started to program a time-based multiplayer gymkhana game for teams playing in a specific forest, with clues that had to be solved using the NDS.


These projects taught me how hard it is to create a game without a real graphic artist on the team, but I also found that seeing your results running on a games console is an amazing feeling.

Note: the scratching game is about... that. With difficulty levels ranging from stinging to herpes, and a scoring system!

The home-made VR system

During my holidays I was growing quite fat due to inactivity, and I was spending too much time playing Minecraft. At home I have a dance mat for DDR and those rhythm games, but I don't really like them, so I decided to create something fun, healthy and a bit geeky: a home-made virtual reality system!

I found GlovePIE, an input emulator where you can create scripts to remap almost any controller, from dance mats to Wiimotes. And yes, I have both. I created my first script for Minecraft, where you can walk using the dance mat (1 step in the real world == 1 step in Minecraft) in a quite realistic way (it is not about pressing a single button, but walking naturally) and also jump (the character jumps when you lift both feet off the ground, so it is almost 1:1). For digging and gathering wood you have to shake the Wiimote as if you were swinging a pickaxe, and for placing blocks you have to move the Nunchuk.

video

Then I started to play Skyrim, and so I gained weight again; that's why I remade the script! It can now detect when the user is sprinting, and it also implements different controls for each weapon: when you are using a sword you need to swing the Wiimote horizontally to perform a light attack and vertically for a strong attack; you can take cover with your shield by raising the Nunchuk, and shake it from there to push things with it; you have to raise your hands to control spells and, most importantly, you can use your voice. GlovePIE can easily detect voice commands, and in Skyrim this means it is not only possible to give simple orders like "save the game" or "equip the bow", but you can also shout! So if you want the character to shout the spell "FUS ROH DAH", all you need to do is shout it!
video
I forgot to show how to hit things with the bow by shaking the Nunchuk :-( Also, the voice commands are in Spanish.

Who needs Kinect?


Augmented Reality System


After 5 years of studying, the time for the dissertation arrives. You can choose one of the many subjects proposed by the professors, but I decided to go my own way and create a whole augmented reality system.

The original idea consisted of a program that should work in open environments, without markers, trying to fuse the virtual and the real world naturally. In fact, those "open environments" were a problem, because giving the user so much freedom can result in low-performance visualisation on a smartphone. This was the key point, and I wanted to succeed where other programs such as Layar (poor integration) or AiRaid (lack of freedom) had failed.

The system was developed over a year using C++ and OpenGL for bada smartphones, and it integrates some new ideas that make it excel over other AR systems:



  • Efficient usage of the sensors to obtain sound registration. This allows the program to work faster in open environments and to recover from measurement errors more quickly.
  • Real lighting system, using GPS-based weather information and sun-tracking algorithms to achieve realistic integration (a sketch of the sun-position part follows below).



video
In this video the sun's movement has been accelerated and the perspective shifted for didactic purposes.
  • Hybrid GPS. The system responds correctly even in extreme scenarios where no accurate GPS data is available, thanks to the advanced pedometer I developed.
  • Game logic integration made easy. Programming an augmented video game is really easy thanks to the well-designed system architecture.

video
This example recreates the worst-case scenario where no GPS signal is received.
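To give an idea of the sun-tracking part, here is a simplified sun-position calculation of the kind the lighting system needs (illustrative only: it ignores the equation of time, refraction and timezone subtleties, so expect errors of a few degrees):

    // Simplified sun position from date, local solar time and latitude.
    #include <cmath>
    #include <cstdio>

    const double DEG = M_PI / 180.0;

    // dayOfYear: 1..365, solarHour: local solar time, latitude in degrees.
    void sunPosition(int dayOfYear, double solarHour, double latitudeDeg,
                     double* elevationDeg, double* azimuthDeg) {
      double decl = -23.44 * DEG * std::cos(2.0 * M_PI * (dayOfYear + 10) / 365.0);
      double hourAngle = 15.0 * (solarHour - 12.0) * DEG;  // 15 degrees per hour
      double lat = latitudeDeg * DEG;

      double sinElev = std::sin(lat) * std::sin(decl) +
                       std::cos(lat) * std::cos(decl) * std::cos(hourAngle);
      double elev = std::asin(sinElev);

      double cosAz = (std::sin(decl) - std::sin(elev) * std::sin(lat)) /
                     (std::cos(elev) * std::cos(lat));
      double az = std::acos(std::fmax(-1.0, std::fmin(1.0, cosAz)));
      if (hourAngle > 0) az = 2.0 * M_PI - az;  // afternoon: sun in the west

      *elevationDeg = elev / DEG;
      *azimuthDeg = az / DEG;  // 0 = north, 90 = east
    }

    int main() {
      double elev, az;
      sunPosition(172, 12.0, 53.48, &elev, &az);  // midsummer noon, Manchester
      std::printf("sun elevation %.1f deg, azimuth %.1f deg\n", elev, az);
    }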

For further information, an explanatory extract from the dissertation and many interesting papers about AR can be found here, though I am afraid it is in Spanish.

Note: the pink artifacts appear when taking a screenshot and are not present in the real application.

The Augmented Agenda

Augmented reality is probably one of the most eye-catching technologies around nowadays. I still remember dreaming about it years before it became an emerging technology. Much later, I was able to write a complete dissertation about it (I talk about this in another section).

My very first try at AR meant building a weird idea I had a long time ago: some day AR and social networks will work hand in hand, and people will be able to see personal information about me just by pointing their phones at me. This terrible future is very close, and I decided it was possible to create an app that illustrates the idea.


My little experiment brings AR and social information together in an Augmented Agenda. Created for bada phones (C++), the app is capable of detecting the faces of your contacts and showing their names, mobile numbers or e-mail addresses. It also includes a drawing program to create masks for the faces, so you can easily remember who each person is by enhancing their image with funny glasses (or hats, or scars...). It is very useful for recognising people you met a long time ago whose names you can't remember, without the embarrassment of asking.


Oink! Mobile version

Oink! is a Spanish blog created in 2001 with no more tools than a keyboard and the brain of Mr.Oink. There, he links dozens of webpages every day to amuse the souls of those bored in their offices; from crazy games to weird shops, you can find anything. It has thousands of daily readers and many mentions in the media, so in 2011 Mr.Oink decided it was time to create a mobile version. I had followed the blog for years, and once I finished my studies I wanted to help him release Oink! Mobile.


Using PHP, MySQL and jQuery Mobile, it took me a few days to program a nice-looking comments section and to enhance the old structure with some clean, and always elegant, regular expressions. The mobile version is still under development, but hundreds of people are already enjoying it from their beds, the bus... or some other places, ahem.

Minekhaan

Minecraft is an impressive and extremely fun game, and I really love it. So when I decided to start learning OpenGL, this game was my model: a simple, but huge, game.



MineKhaan is just an experiment programmed from scratch using OpenGL and C to create custom worlds out of big cubes. You can walk and jump, create or erase cubes with different textures and, moreover, you can select figures (a tree, a house or some grass) and copy-paste or save them, so you can build your world in a faster and friendlier way!
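The figure copy-paste boils down to snapshotting a box of the cube grid and stamping it somewhere else. A tiny sketch of the idea (invented names and sizes; the real project is plain C with OpenGL rendering on top):

    // The world is a dense 3D grid of block IDs; a "figure" is a snapshot
    // of a box in that grid that can be pasted elsewhere.
    #include <array>
    #include <vector>

    const int W = 64, H = 32, D = 64;                    // world size in cubes
    using World = std::array<unsigned char, W * H * D>;  // 0 = air, else texture id

    inline unsigned char& at(World& w, int x, int y, int z) {
      return w[(y * D + z) * W + x];
    }

    struct Figure {
      int sx, sy, sz;  // box size
      std::vector<unsigned char> blocks;
    };

    Figure copyBox(World& w, int x, int y, int z, int sx, int sy, int sz) {
      Figure f{sx, sy, sz, {}};
      for (int j = 0; j < sy; ++j)
        for (int k = 0; k < sz; ++k)
          for (int i = 0; i < sx; ++i)
            f.blocks.push_back(at(w, x + i, y + j, z + k));
      return f;
    }

    void pasteBox(World& w, const Figure& f, int x, int y, int z) {
      int n = 0;
      for (int j = 0; j < f.sy; ++j)
        for (int k = 0; k < f.sz; ++k)
          for (int i = 0; i < f.sx; ++i)
            at(w, x + i, y + j, z + k) = f.blocks[n++];  // stamp the figure
    }

    int main() {
      static World world{};                       // zero-initialised: all air
      at(world, 1, 0, 1) = 7;                     // place a textured cube
      Figure tree = copyBox(world, 0, 0, 0, 4, 8, 4);
      pasteBox(world, tree, 20, 0, 20);           // stamp the same figure elsewhere
    }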