“From Paper to Pixels” interview featured on Intel blog

September 5th, 2013 by Adam Doucette

Recently our own Rebecca Allen, CEO and Creative Director at Infrared5, was featured in an interview by Wendy Boswell of Intel to discuss perceptual computing and the upcoming From Paper to Pixels event. Also featured in the post are two of the artists participating in this year’s event, Anna Kristina Goransson and Rob Gonsalves. Follow the link and stay tuned for more details regarding the event, which will be held on September 21 & 22 here at the Infrared5 offices from 11:00 am to 6:00 pm.

Check out what Rebecca had to say!


START 2013 – A Conference Not to Miss

March 19th, 2013 by Rebecca Allen

Last Thursday, Chris Allen (one of my business partners and husband) and I headed on a train to New York City for the inaugural conference called Start. We were one of 23 startups invited to show off our product, Brass Monkey, to the highly curated group of 500 attendees. Hands down, it has to be one of the best events I have ever attended. From the moment we arrived at Centre 548 in Chelsea at 7:30am Friday morning until we left at 6:30pm that evening, it was one great conversation after another. Paddy Cosgrave and his amazing team of organizers at f.ounders did an outstanding job. We were honored to be selected as an exhibitor and excited to be amongst such innovative products and applications. Here are a few of my favorites: LittleBits, 3Doodler, BrandYourself and Magisto. LittleBits is an open source library of electronic modules that snap together with magnets for prototyping, learning and fun. Such a cool product that totally hits on so much that we love: open source technology, education, fun and creativity!

Since Chris and I were managing our booth, we were unable to attend the round tables and talks that happened throughout the day. We are excited that the talks were recorded, and Chris and I will be spending some quality time going through all of this great content. We had a fabulous day and would recommend to anyone that’s into startups to attend Start 2014 when it comes around next year. I look forward to making it to WebSummit, f.ounders’ other event, in the fall. Dublin, here we come!


Plan of Attack and Face Tracking Challenges Using the Intel Perceptual Computing SDK

February 19th, 2013 by Chris Allen

This post was featured on Intel Software’s blog, in conjunction with Intel’s Ultimate Coder Challenge. Keep checking back to read our latest updates!

We are very excited to be working with the Intel Perceptual Computing (IPC) SDK and to be a part of the Ultimate Coder Challenge! The new hardware and software that Intel and its partners have created allows for some very exciting possibilities. It’s our goal to really push the boundaries of what’s possible using the technology. We believe that perceptual computing plays a huge role in the future of human-to-computer interaction, and isn’t just a gimmick shown in movies like Minority Report. We hope to prove out some of the ways that it can actually improve the user experience with the game that we are producing for the competition.

Before we begin with the bulk of this post, we should cover a little bit on the makeup of our team and the roles that each of us play on the project. Unlike many of the teams in the competition, we aren’t a one-man show, so each of our members plays a vital role in creating the ultimate application. Here’s a quick rundown of our team:

Kelly Wallick – Project Manager

TECH
Chris Allen – Producer, Architect and Game Designer
Steff Kelsey – Tech Lead, Engineer focusing on the Intel Perceptual Computing SDK inputs
John Grden – Game Developer focusing on the Game-play

ART
Rebecca Allen – Creative Director
Aaron Artessa – Art Director, Lead Artist, Characters, effects, etc.
Elena Ainley – Environment Artist and Production Art

When we first heard about the idea of the competition we started thinking about ways that we could incorporate our technology (Brass Monkey) with the new 3D image tracking inputs that Intel is making accessible to developers. Most of the examples being shown with the Perceptual Computing SDK focus on hand gestures, and we wanted to take on something a bit different. After much deliberation we arrived at the idea of using the phone as a tilt-based game controller input, and using head and eye tracking to create a truly immersive experience. We strongly believe that this combination will make for some very fun game play.

Our art team was also determined not to make the standard 3D FPS shoot-em-up game that we’ve seen so many times before, so we arrived at a very creative use of the tech with a wonderful background story of a young New Zealand Kiwi bird taking revenge on the evil cats that killed his family. To really show off the concept of head-tracking and peering around items in the world, we decided on a paper cutout art style. Note that this blog post and the other posts we will be doing on the Ultimate Coder site are really focused on the technical challenges and processes we are taking, and much less on the art and game design aspects of the project. After all, the competition is called the Ultimate Coder, not the Ultimate Designer. If you are interested in the art and design of our project, and we hope that you are, then you should follow our posts on our company’s blogs, which will be covering much more of those details. We will be sure to reference these posts in every blog post here as well so that you can find out more about the entire process we are undertaking.

The name of the game we’ve come up with for the competition is Kiwi Catapult Revenge.

So with that, let’s get right to the technical nitty gritty.

Overview of the Technology We are Using

Unity

As we wanted to make a 3D game for the competition we decided to use Unity as our platform of choice. This tool allows for fast prototyping, ease of art integration and much more. We are also well versed in using Unity for a variety of projects at Infrared5, and our Brass Monkey SDK support for it is very polished.

Brass Monkey

We figured that one of our unique advantages in the competition would be to make use of the technology that we created. Brass Monkey SDK for Unity allows us to turn the player’s smartphone into a game controller for Unity games. We can leverage the accelerometers, gyroscopes and touch screens of the device as another form of input to the game. In this case, we want to allow for steering your Kiwi bird through the environment using tilt, and allow firing and control of the speed via the touch screen on the player’s phone.

Intel Perceptual Computing SDK

We decided to leverage the IPC SDK for head tracking, face recognition and possibly more. In the case of Kiwi Catapult Revenge, the player will use his eyes for aiming (the player character can shoot lasers from his eyes). The environment will also shift according to the angle from which the user is viewing it, causing the scene to feel like real 3D. Take a look at this example using a Wiimote for a similar effect to the one we are going for. In addition, our player can breathe fire by opening his or her mouth in the shape of an “o” and pressing the fire button on the phone.

There are certainly other aspects of the SDK we hope to leverage, but we will leave those for later posts.

OpenCV

We are going to use this C-based library for more refined face tracking algorithms. Read on to find out why we chose OpenCV (opencv.org) to work in conjunction with the IPC SDK. Luckily, OpenCV was originally developed by Intel, so hopefully that gets us additional points for using two of Intel’s libraries.

Head Tracking

The biggest risk item in our project is getting head tracking that performs well enough to be a smooth experience in game play, so we’ve decided to tackle this early on.

When we first started looking at the examples that shipped with the IPC SDK, there were very few dealing with head tracking. In fact, it was really only in the latest release that we found anything even close to what we proposed to build, and it was only in this release that these features were exposed to the Unity version of the SDK. What we found were examples that simply don’t perform very well. They are slow, not all that accurate, and unfortunately just won’t cut it for the experience we are shooting for.

To make matters worse, the plugin for Unity is very limited. It doesn’t allow us to manipulate much, if anything, with regard to the head tracking or face recognition algorithms. As a Unity developer you either have to accept the poorly performing pre-canned versions of the algorithms the SDK exposes, or get the raw data from the camera and do all the calculations yourself. What we found is that face tracking with what they provide gives us sub-3-frames-per-second performance that isn’t very accurate. Now to be clear, the hand gesture features are really very polished, and work well in Unity. It seems that Intel’s priority has been on those features, and head/face detection is lagging very much behind. This presents a real problem for our vision of the game, and we quickly realized that we were going to have to go about it differently if we were going to continue with our idea.

OpenCV

When we realized the current version of the IPC SDK wasn’t going to cut it by itself, we started looking into alternatives. Chris had done some study of OpenCV (CV stands for computer vision) a while back, and he had a book on the subject. He suggested that we take a look at that library to see if anyone else had written more effective head and eye tracking algorithms using that tool-set. We also discovered what looked like a very polished and effective head tracking library called OpenTL. We got very excited with what we saw, but when we went to download the library, we discovered that OpenTL isn’t so open after all. It’s not actually open source software, and we didn’t want to get involved with licensing a 3rd party tool for the competition. Likewise, the FaceAPI from SeeingMachines looked very promising, but it also carried a proprietary license. Luckily, what we found using OpenCV appeared to be more than capable of doing the job.

Since OpenCV is a C library we needed to figure out how to get it to work within Unity. We knew that we would need to compile a dll that would expose the functions to the Mono based Unity environment, or find a version out on the Internet that had already done this. Luckily we found this example, and incorporated it into our plans.

Use of the Depth Camera

The other concern we had was that all the examples we saw of face tracking in real-time didn’t make use of any special camera. They all used a simple webcam, and we really wanted to leverage the unique hardware that Intel provided us for the challenge. One subtle thing that we noticed with most of the examples we saw was they performed way better with the person in front of a solid background. The less noise the image had the better it would perform. So, we thought, why not use the depth sensor to block out anything behind the user’s head, essentially guaranteeing less noise in our images being processed regardless of what’s behind our player. This would be a huge performance boost over traditional webcams!
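The masking idea above can be sketched in a few lines of NumPy. This is only an illustration with toy data: the function name, the depth threshold, and the frame shapes are our own placeholders, not the IPC SDK's API (in the real pipeline the RGB and depth frames would come from the camera):

```python
import numpy as np

def mask_background(rgb, depth, max_depth_mm=900):
    """Zero out pixels whose depth reading is farther than max_depth_mm.

    rgb:   H x W x 3 color frame
    depth: H x W depth frame in millimeters (0 = no reading)
    """
    # Keep only pixels that have a valid reading and are close to the camera.
    near = (depth > 0) & (depth <= max_depth_mm)
    return rgb * near[:, :, None].astype(rgb.dtype)

# Toy 2x2 frame: top row is near the camera, bottom row is far away / unread.
rgb = np.full((2, 2, 3), 200, dtype=np.uint8)
depth = np.array([[600, 700],
                  [2000, 0]], dtype=np.uint16)
masked = mask_background(rgb, depth)
print(masked[0, 0].tolist())  # near pixel kept -> [200, 200, 200]
print(masked[1, 0].tolist())  # far pixel blanked -> [0, 0, 0]
```

Anything the face detector then processes is a frame whose background is already flat black, which is exactly the "solid background" condition the webcam demos benefited from.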

Application Flow and Architecture

After carefully considering our tools we finally settled on an architecture that spelled out how all the pieces would work together. We would use the Unity IPC SDK for the camera frames as raw images, and for getting the depth sensor data to block out only the portions of the image that had the person’s head. We would then leverage OpenCV for face tracking algorithms via a plugin to Unity.

We will be experimenting with a few different combinations of algorithms until we find something that gives us the performance we need to implement a game controller and (hopefully) also satisfies the desired feature set: tracking the head position and rotation, identifying whether the mouth is open or closed, and tracking the gaze direction of the eyes. Each step in the process is done to set up the steps that follow.

In order to detect the general location of the face, we propose to use the Viola-Jones detection method.  The result of this method will be a smaller region of interest (ROI) for mouth and eye detection algorithms to sort through.
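Much of Viola-Jones's speed comes from the integral image, which lets the sum over any rectangular feature be computed in at most four array lookups regardless of the rectangle's size. A minimal NumPy sketch of that data structure (the detector we would actually run is OpenCV's pre-trained cascade; the function names here are ours):

```python
import numpy as np

def integral_image(img):
    """Cumulative sum over both axes; ii[y, x] = sum of img[:y+1, :x+1]."""
    return img.cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, x, y, w, h):
    """Sum of the w x h rectangle with top-left corner (x, y),
    using only four lookups into the integral image."""
    total = ii[y + h - 1, x + w - 1]
    if x > 0:
        total -= ii[y + h - 1, x - 1]
    if y > 0:
        total -= ii[y - 1, x + w - 1]
    if x > 0 and y > 0:
        total += ii[y - 1, x - 1]
    return total

img = np.arange(16, dtype=np.int64).reshape(4, 4)
ii = integral_image(img)
# Cross-check the four-lookup sum against direct summation:
print(rect_sum(ii, 1, 1, 2, 2), img[1:3, 1:3].sum())  # -> 30 30
```

Because every Haar-like feature is just a handful of these rectangle sums, a cascade can reject non-face windows almost for free, which is why it is a good cheap first pass before the heavier mouth and eye steps.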

There are a few proposed methods to track the facial features and solve for the rotation of the head.  The first method is to use the results from the first pass to define 3 new ROIs and to search specifically for the mouth and the eyes using sets of comparative images designed specifically for the task.  The second method is to use the Active Appearance Model (AAM) to match a shape model of facial features in the region.  We will go into more detail about these methods in future posts after we attempt them.

Tracking the gaze direction will be done by examining the ROI for each eye and determining the location of the iris and pupil by the Adaptive EigenEye method.
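We haven't implemented the EigenEye step yet, but the underlying intuition (the pupil is the darkest blob in the eye ROI) can be shown with a far simpler centroid-of-dark-pixels estimate. To be clear, everything below (the function name, the darkness fraction, the synthetic patch) is our own placeholder sketch, not the EigenEye method itself:

```python
import numpy as np

def pupil_centroid(eye_roi, dark_frac=0.1):
    """Estimate the pupil center as the centroid of the darkest
    fraction of pixels in a grayscale eye region."""
    cutoff = np.quantile(eye_roi, dark_frac)   # intensity below which pixels count as pupil
    ys, xs = np.nonzero(eye_roi <= cutoff)
    return xs.mean(), ys.mean()                # (x, y) in ROI coordinates

# Synthetic 9x9 eye patch: bright sclera with a dark 3x3 "pupil" at the center.
eye = np.full((9, 9), 230.0)
eye[3:6, 3:6] = 20.0
cx, cy = pupil_centroid(eye)
print(cx, cy)  # -> 4.0 4.0
```

The real algorithm will have to cope with eyelids, glints and lighting changes, which is exactly why a learned eigen-basis beats a fixed threshold; this sketch just shows what "locating the iris and pupil within the ROI" means.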

Trackable points will be constrained with Lucas-Kanade optical flow.  The optical flow compares the previous frame with the current one and finds the most likely locations of tracked points using a least squares estimation.
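To make that least-squares step concrete, here is a bare-bones, single-window Lucas-Kanade solve: stack the spatial gradients into a matrix, the frame differences into a vector, and solve for the displacement that best explains them. This is a sketch on synthetic data, with no pyramid or iterative refinement (in practice we would lean on OpenCV's implementation):

```python
import numpy as np

def lk_flow(prev, curr, x, y, win=2):
    """One-window Lucas-Kanade step: least-squares fit of the displacement
    (dx, dy) that best explains the frame difference inside the window."""
    p = prev[y - win:y + win + 1, x - win:x + win + 1].astype(float)
    c = curr[y - win:y + win + 1, x - win:x + win + 1].astype(float)
    Iy, Ix = np.gradient(p, edge_order=2)    # spatial gradients of the previous frame
    It = c - p                               # temporal difference between frames
    A = np.column_stack([Ix.ravel(), Iy.ravel()])
    d, *_ = np.linalg.lstsq(A, -It.ravel(), rcond=None)
    return d                                 # (dx, dy)

# Smooth bowl-shaped patch whose minimum slides one pixel to the right.
xs, ys = np.meshgrid(np.arange(9), np.arange(9))
prev = (xs - 4.0) ** 2 + (ys - 4.0) ** 2
curr = (xs - 5.0) ** 2 + (ys - 4.0) ** 2
dx, dy = lk_flow(prev, curr, 4, 4)
print(round(float(dx), 3), round(abs(float(dy)), 3))  # -> 1.0 0.0
```

The same normal equations break down on featureless or edge-only patches (the aperture problem), which is why the tracked points need to be well-textured corners in the first place.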

Summing it Up

We believe that we’ve come up with an approach that leverages the unique capabilities of the Perceptual Computing Camera and actually adds to the user experience of our game concept. As we start in on the development it’s going to be interesting to see how much this changes over the next seven weeks. We already have several questions about how it’s going to go: How much of what we think will work actually will? What performance tuning will we need to do? How many of the other features of the IPC SDK can we leverage to make our game more engaging? Will we have enough time to pull off such an ambitious project in such a short time frame?

Wow! That turned out to be a long post! Thanks for taking the time to read what we are up to.

We are also curious to hear from you, other developers out there. What would you do differently given our goals for the project? If you’ve got experience with computer vision algorithms, or even just want to chime in with your support, we would love to hear from you!


Seven weeks. Seven teams. ONE ULTIMATE APP!

February 6th, 2013 by Rosie

Infrared5 and Brass Monkey are excited to announce their participation in Intel Software’s Ultimate Coder Challenge, ‘Going Perceptual’! The IR5/Brass Monkey team, along with six other teams from across the globe, will be competing in this seven week challenge to build the ultimate app. The teams will be using the latest Ultrabook convertible hardware, along with the Intel Perceptual Computing SDK and camera, to build their prototypes. The competitors range from large teams to individual developers, and each will take a unique approach to the challenge. The question is: which team or individual can execute their vision most successfully under such strict time constraints?

Here at Infrared5/Brass Monkey headquarters, we have our heads in the clouds and our noses to the grindstone. We are dreaming big, hoping to create a game that will take user experience to the next level. We are combining gameplay experiences like those available on Nintendo’s Wii U and Microsoft’s Kinect. The team will use the Intel Perceptual Computing SDK for head tracking, which will allow the player to essentially peer into the tablet/laptop screen like a window. The 3D world will change as the player moves his head. We’ve seen other experiments that do this with other technology and think it is really remarkable. This one by Johnny Lee, using Wiimotes, is one of the most famous. Our team will be exploring this effect and other uses of the Intel Perceptual Computing SDK combined with Brass Monkey’s SDK (using a smartphone as a controller) to create a cutting edge, immersive experience. Not only that, but our creative team is coming up with all original IP to showcase the work.

Intel will feature documentation of the ups and downs of this process for each team, beginning February 15th. We will be posting weekly on our progress, sharing details about the code we are writing, and pointing out the challenges we face along the way. Be sure to check back here as the contest gets under way.

What would you build if you were in the competition? Let us know if you have creative ideas on how to use this technology; we would love to hear them.

We would like to thank Intel for this wonderful opportunity and wish our competitors the best of luck! Game on!


From Pixel to Paper – The Story of A Mural

September 10th, 2012 by Rosie

Last month, Infrared5 unveiled something exciting: an 11-foot-high custom designed mural in our entry space. From conception to completion, spanning almost an entire year, this project was a labor of love. The work was designed by LA-based artist Bradley Munkowitz. Rebecca Allen, Infrared5’s CEO and the fearless leader of this project, met Bradley at a FITC conference many years ago, where Bradley made a lasting impression. “His work is just as engaging as his personality,” says Allen. When it came time to look for an artist to create a mural for our entryway, Rebecca knew just the person to call. “I was initially taken by the scale of the piece; being 11 feet square… So I wanted to create an artwork that had a great deal of dimensionality, because on that grand scale it’d feel immersive, which would really make for a captivating entryway mural,” says Bradley. The artist went about creating a series of images digitally, using Autodesk Maya and procedural textures, which allow for rendering at any size. “I think I submitted about 20 different designs and Rebecca and I chose the best one for the application.” If you feel like our entryway is playing a trick on your eyes, you are right: Bradley is heavily influenced by Op Art. “I just love the visual movement, the graphic nature, and obviously the trippy dimensionality,” says Bradley.

one of several options created by Bradley Munkowitz

The task of getting this mural hung fell onto my plate sometime in late spring 2012. I was new to Infrared5, still figuring out what my position here really meant, when Rebecca asked me to look into having someone come to hang the mural. I’ve come to think of myself as the resident ‘figure-it-out-ologist’. Much of my job entails putting the time and focus into getting things done that take a lot of research time, tasks that have historically been put on the back burner in favor of focus on client work. I try to adapt IR5’s motto ‘yeah, we can build that’ into my own ‘yeah, I can research that’ in order to get things done.

I knew that finding someone with the skill to hang this mural was going to be tricky. After all, the mural was shipped to us as three panels, 11 feet tall by roughly 3.5 feet wide. These panels have a backing that peels off to reveal an adhesive that sticks to the wall. As our project manager Kelly Wallick said, “It’s hard enough to put contact paper into drawers!”

I hit the internet. Companies that specialized in large-scale vinyl installation wouldn’t install anything they had not produced, and finding contractors proved a challenge. After weeks of telephone tag, multiple early Saturday morning phone calls from one terrifyingly overeager applicant, and many frustrating stops and starts, we found our way to Abigail Newbold.

Abigail, a fellow MassArt graduate, is an artist who creates installations that confront ideas of comfort and survival. On her website, Newbold states, “I am motivated in my quest to evaluate and distill by a desire to be able to feel at home anywhere.” We were referred to Newbold via her coworker at the deCordova, and she proved to have just the patience and attention to detail to take on this project.

Abigail and her partner Ricky Marsee arrived at 10 am Sunday morning to begin work. Stunningly confident in the face of a meticulous task, the duo set to work. It was nearly 7 pm when they wrapped up. It had been a long battle, but the mural was finally up.

Having taken almost a year from conception to execution, the mural is as dynamic and contemporary as the work we hope to put out, and Infrared5 is thrilled to be displaying it. No one is more excited to have the mural up than Rebecca. “I am so glad that we all persisted and have an amazing piece to enjoy and set the tone for clients!”
