Gaming Ouroboros at the Global Game Jam 2012

February 6th, 2012 by Elliott Mitchell

Now and then, as a professional 3D technical artist and game designer, I find it’s helpful to step out of my usual routine and make a game over a weekend. Why? Because it keeps life fresh and exciting while providing a rare sense of instant gratification in the crazy world of video game development. Making a video game over a weekend isn’t easy for one person alone. For this, Global Game Jam was created.

This year’s Global Game Jam was held January 27 – 29, 2012. The site I registered with was the Singapore-MIT GAMBIT Game Lab in Cambridge, Massachusetts. Here’s the lowdown on my experience.

Global Game Jam 2012 - Photo Courtesy Michael Carriere

The Global Game Jam (GGJ) is an annual International Game Developers Association (IGDA) game creation event. The event unites people from across the globe to make games in under 48 hours. Anyone is welcome to participate in the game jam. Jammers range from industry professionals to hobbyists and students. The primary framework is that, under common constraints, each team completes a game, without preconceived ideas or preformed teams, in under 48 hours. This is intended to encourage creativity, experimentation and collaboration, resulting in small but innovative games. To support this endeavor, schools, businesses and organizations volunteer to serve as official host sites. Several prominent sponsors such as Loot Drop, Autodesk, Microsoft and Brass Monkey also helped foot the bill.

HOW IT WENT DOWN

Keynote -

Brenda Brathwaite and John Romero addressing the Global Game Jammers 2012 - Photo courtesy Michael Carriere

GGJ site facilitators kicked off the Jam with a pre-recorded video from the IGDA website titled How to Build A Game in Less Than 48 Hours. The speakers in the video were Gordon Bellamy, the Executive Director of the IGDA; John Romero (Quake) and Brenda Brathwaite (Wizardry), both co-founders of Loot Drop; Gonzalo Frasca (Ludology.org), co-founder of Powerful Robot Games; and Will Wright (The Sims), co-founder of Maxis. The speakers all gave excellent advice on creativity, leadership, scope and collaboration within a game jam.

Global Constraint -

Ouroboros

Our primary constraint was revealed after the keynote video: an image of a snake eating its own tail. The snake represented Ouroboros, an immortal from Greek mythology. Variations of the symbol span time and space from the modern day back to antiquity. The snake, or dragon in some instances, eating its own tail has made appearances in ancient Egypt, Greece, India, Mexico, West Africa, Europe, South America and elsewhere under a host of names. Its meaning can be interpreted as opposites merging in a unifying act of cyclical creation and destruction, immortal for eternity. To alchemists the Ouroboros symbolized the Philosopher’s Stone.

Group Brainstorming –

Brainstorming Global Game Jam 2012

After the keynote, game jammers arbitrarily split into 5 or 6 groups of 11 or so and went into different labs to brainstorm Ouroboros game pitches. After an amusing ricochet of thoughts, references, revisions, personalities and passions, each room crafted 6 pitches, most of which were within the scope of the 48-hour Game Jam.

Pitch and Choose -

When the groups reassembled into the main room it was time to pitch.

The Rules-

  • Pitches needed to be under a minute
  • Titles had to be 3 words or less
  • Themes had to relate to the Ouroboros
  • The person pitching a game did not necessarily need to be on that potential team

There were about 30 pitches, after which each jammer had to choose a role on a game / team that appealed to them. Each jammer had a single piece of color-coded paper with their name, skill level and intended role.

The Roles-

Choose Your Team - Global Game Jam 2012- Photo courtesy Michael Carriere

  • Programmer
  • Artist
  • Game Design
  • Audio
  • Producer

Games with too many team members were pruned, and others lacking members for roles such as programmer were either augmented or eliminated. Eventually semi-balanced teams of 4-6 members were formed around the 11 most popular pitches.

My team decided to develop our game for the Commodore 64 computer using Ethan Fenn’s Comma8 framework. We thought the game narrative and technology married well.

Time to Jam - Photo Courtesy Michael Carriere

Time to Jam -

Post team formation, clusters of lab space were claimed. Even though most of us had brought our personal laptops, the labs were stocked with sweet dual-boot Windows 7 & OS X systems with cinema displays. The lab computers were pre-installed with industry-standard software such as Unity3D, Maya and Photoshop. We were also provided peripherals such as stylus tablets and keyboards. Ironically, I was most excited by the real-world prototyping materials, like blocks and graph paper, which were also provided by our host.

First Things First –

Our space at Global Game Jam 2012 at Singapore - MIT GAMBIT Game Lab

After claiming a lab with another awesome team, we immediately:

  • Set up version control (SVN)
  • Installed custom tools for Comma8 (Python, Java, Sprite Pad, Tiles and more)
  • Confirmed the initial scope of the game
  • Set up a collaborative project management system with a team Google Group and Google Doc

Cut That Out –

We needed to refine the scope once we were all aware of the technical limitations, such as:

  • The Commodore 64, from 1982, is old
  • 64 KB of system RAM is not much
  • 8-bit
  • Programmed in assembly language
  • 320 x 200 pixels
  • 16 pre-determined crappy colors
  • 3 oscillators
  • Rectangular pixels
  • Screen space
  • Developing in emulation on a network
  • Loading and testing a playable on legacy Commodore 64 hardware
  • Less than 48 hours to get it all working

Our scope was too big: too many levels. Other factors causing us to consider limiting the scope further included:

  • None of us had made games for the C64 before
  • Comma8 is an experimental engine, still in development by Ethan, that was untested in a game jam situation
  • Tools such as Sprite Pad and Tiles are very archaic and limiting apps for art creation
  • The build process would do strange things to the art, which required constant iteration

Rapid Iterative Prototyping -

Walking Backwards Prototype Global Game Jam 2012 - Photo Courtesy Michael Carriere

Physical prototyping was employed to reduce the scope before we went too far down any rabbit holes. We used the following materials to prototype:

  • Glass white board
  • Markers
  • Masking tape on the walls
  • Paper notes tacked to the walls
  • Graph paper
  • Wooden blocks
  • Pens

Results of Physical Prototyping-

  • Cut the scope from 9 levels down to 5, the minimum needed to carry the circular Ouroboros theme of our narrative far enough
  • Nailed the key mechanics
  • Refined the narrative
  • Determined scale and placement of graphical elements
  • Limited overall scope

Naturally we ran into design roadblocks and needed to revise and adapt a few times. Physical prototyping once again sped up that process and moved us along to completion.

QA-

Global Game Jam 2012 - Photo Courtesy Michael Carriere

We enlisted a few play testers on the second night and final hours of the game jam to help us gauge the following:

  • Playability
  • Comprehension of the narrative
  • Recognition of the lo-res art assets
  • Overall player experiences
  • Feelings about the game
  • Suggestions
  • Bugs

We did wind up having to revise the art, level design and narrative slightly after play testing to reach a better balance and a better game.

Deadline -

Walking Backwards - C64 - Global Game Jam 2012

1.5 hours before the game jam was to end, it was pencils down. Time to upload to the IGDA Global Game Jam website, any other host servers and the site presentation computer. Out of the total 48 hours allotted to the game jam, we only had about 25 working lab hours. Much time was spent on logistics like the keynote video, brainstorming, pitching, uploading and presenting. Our site was also only open from 9 am to midnight, so there was no 24-hour access. Even so, with 25 hours of lab time, all 11 games at my site were uploaded and ready for presentation.

Presentations -

Global Game Jam - Singapore-MIT GAMBIT Game Lab Games

The best part ever! The presentations were so exciting. Many of the jammers had been so focused on their work that they were not aware of what other teams were up to. One by one, teams went up and presented their games in whatever state they were in at the deadline.

Most were pretty innovative, experimental and funny. Titles such as The Ouroboros Hangover and Hoop Snake had the jammers in stitches. Fire-farting dragons, hoop snakes, a drunk Ouroboros and so on were big hits. Unity, HTML5, Flash, Flex, XNA, Comma8 and Flixel were used to create these great games in under 48 hours.

Takeaways -

My teammates and I consider the game we made, Walking Backwards, to be a success. We accomplished our goals:

Walking Backwards Team - Global Game Jam 2012- Photo courtesy Michael Carriere

  • Experimental game
  • A compelling narrative
  • Awesome audio composition
  • Achieved most of the functionality we wanted
  • Runs on an original Commodore 64 with Joysticks
  • Can be played with a Java emulator
  • Got to work together under pressure and have a blast

Would have liked-

  • Avatar to animate properly (we had bi-directional sprites made but not implemented)
  • More audio for sound effects

The final takeaway I had, besides feeling simultaneously exhilarated and exhausted, is how essential networking at the game jam is to greater success. Beyond just meeting new people, networking at the jam made or broke some games. Some teams didn’t take time to walk around and talk to other teams. In one instance, a team didn’t figure out an essential ghost mechanic by the end of the jam. They realized at presentation time that another team had implemented, in the same engine, the very mechanic they had failed to nail down. Networking also provided mutual feedback, play testing, critique, advice, friendships and rounds of beer after the event ended. Many of the jammers now have a better sense of each other’s strengths and weaknesses, their performance under stress, and their abilities to collaborate, lead and follow.

I, for one, will be a lifelong game jammer, ready to collaborate while pushing into both familiar and new territories of game development with various teams, themes and dreams.

Follow this link to see all the games created at my site, hosted by the Singapore-MIT GAMBIT Game Lab.

——

Elliott Mitchell

Technical Director- Infrared5

Twitter: @Mrt3d


Creating 2nd UV sets in Maya for Consistent and Reliable Lightmapping in Unity 3d

January 11th, 2012 by Elliott Mitchell

Lightmaps in the Unity Editor - Courtesy of Brass Monkey - Monkey Golf

Have you ever worked on a game that was beautifully lit in the Unity editor but ran like ants on molasses on your target device? Chances are you might benefit from using lightmaps. Ever worked on a game that was beautifully lit with lightmaps but looked different between your Mac and PC in the Unity editor? Chances are you might want to create your own 2nd UV sets in Maya.

Example of a lightmap

If you didn’t know, lightmaps are 2D textures pre-generated by baking (rendering) lights onto the surfaces of 3D objects in a scene. These textures are blended with the 3D model’s original textures to simulate illumination and fine shadows without the use of realtime lights at runtime. The number of realtime lights rendering at any given time can make or break a 3D game when it comes to optimal performance. By reducing the number of realtime lights and shadows, your games will play through more smoothly. Using fewer realtime lights also allows for more resources to be dedicated to other aspects of the game, like higher poly counts and more textures. This holds true especially when developing for most 3D platforms including iOS, Android, Mac, PC, Web, Xbox, PS3 and more.

Since the release of Unity 3 back in September 2010, many Unity developers have been taking advantage of Beast lightmapping as a one-stop solution within the Unity editor. At first glance Beast is a phenomenal time saving and performance enhancing tool. Rather quickly, Beast can automate several tedious tasks that would otherwise need to be performed by a trained 3D technical artist in an application like Maya. Those tasks, mostly UV related, are:

UVs in positive UV co-ordinate space

  • Generating 2nd UV sets for lightmapping 3D objects
  • Unwrapping 3D geometry into flattened 2D shells which don’t overlap in the 0 to 1 UV co-ordinate quadrant
  • Packing UV shells (arranging the unwrapped 2D shells to optimally fit within a square quadrant with room for mipmap bleeding)
  • Atlasing lightmap textures (combining many individual baked textures into larger texture sheets for efficiency)
  • Indexing lightmaps (linking each 3D model’s 2nd UV set co-ordinate data with the baked texture atlases in a scene)
  • Applying the lightmaps to your existing models’ shaders to give 3D objects the illusion of being illuminated by realtime lights in a scene

Other UV properties may be tweaked in the Advanced FBX import settings, influencing how the 2nd UVs are unwrapped and packed; these settings can drastically alter your final results and do not always transfer through version control.

Why is this significant? Well, your 3D object’s original UV set is typically used to align and apply textures like diffuse, specular, normal and alpha maps onto the 3D object’s surfaces. There are no real restrictions on laying out your UVs for texturing. UVs may be stretched to tile a texture, they can overlap, be mirrored… Lightmap texturing requirements in Unity, on the other hand, are different and require:

  • A 2nd UV set
  • No overlapping UVs
  • UVs must be contained in the 0 to 1, 0 to 1 UV co-ordinate space

Unwrapping and packing UVs so they don’t overlap and are optimally contained in 0 to 1 UV co-ordinate space is tedious and time consuming for a tech artist. Many developers without a tech artist purchase 3D models online to “save time and money”. Typically those models won’t have 2nd UV sets included. Beast can unwrap lightmapping UVs for the developer without much effort in the Unity Inspector:

Unity FBX import settings for Lightmapping UVs

Advanced Unity FBX import settings for Lightmapping UVs

  • Select the FBX to lightmap in the Unity Project Editor window
  • Set the FBX to Static in the Inspector
  • Check Generate Lightmap UVs in the FBXImporter Inspector settings
  • Change options in the Advanced Settings if needed

Atlasing multiple 3D model’s UVs and textures is extremely time consuming and not always practical especially when textures and models may change at a moment’s notice during the development process.  Frequent changes to atlased assets tend to create overwhelming amounts of tedious work. Again, Beast’s automation is truly a great time saver allowing flexibility in atlasing for iterative level design plus scene, object and texture changes in the Unity editor.

Sample atlases in Unity

Beast’s automation is truly great, except when your team is using both Mac and PC computers on the same project with version control. Sometimes lightmaps will appear to be totally fine on a Mac and look completely messed up on a PC, and vice versa. It’s daunting to remedy this, and it may require, among several tasks, re-baking all the lightmaps for the scene.

Why are there differences between the Mac and PC when generating 2nd UV sets in Unity? The answer is that Mac and PC computers have different floating point precision, which is used to calculate and generate the 2nd UV sets for lightmapping upon import in the Unity editor. The differences between Mac and PC generated UVs are minute but can lead to drastic visual problems. One might assume that with version control like Unity Asset Server or Git the assets would be synced and exactly the same, but they are not. Metadata and version control issues are a topic for another blog post down the road.

What can one do to avoid issues with 2nd UV sets across Mac and PC computers in Unity? Well, here are four of my tips for avoiding lightmap issues in Unity:

Inconsistent lightmaps on Mac and PC in the Unity Editor - Courtesy of Brass Monkey - Monkey Golf

  1. Create your own 2nd UV sets and let Beast atlas, index and apply your lightmaps in your Unity scene
  2. Avoid re-importing or re-generating 2nd UV assets if the project is being developed in Unity across Mac and PC computers and you’re not creating your own 2nd UV sets externally
  3. Use external version control like Git with Unity Pro, with metadata set to be exposed in the Explorer or Finder, to better sync changes to your assets and metadata
  4. Use 3rd party editor scripts like Lightmap Manager 2 to help speed up the lightmap baking process by letting you re-bake single objects without having to re-bake the entire scene

Getting Down To Business – The How To Section

If your 3D model already has a good 2nd UV set and you want to enable Unity to use it:

  • Select the FBX in the Unity Project Editor window
  • Simply uncheck Generate Lightmap UVs in the FBXImporter Inspector settings
  • Re-bake lightmaps
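If you would rather enforce that checkbox in code than trust each teammate’s Inspector settings, an editor script can do it at import time. Here is a minimal sketch, assuming a hypothetical folder convention (“Models/Lightmapped”) for models that already carry hand-made 2nd UV sets; ModelImporter.generateSecondaryUV is the scripted equivalent of the Generate Lightmap UVs checkbox:

    // Editor/LightmapUVPostprocessor.cs -- must live in an "Editor" folder.
    // Sketch only: the "Models/Lightmapped" path convention is an assumption.
    using UnityEditor;

    public class LightmapUVPostprocessor : AssetPostprocessor
    {
        void OnPreprocessModel()
        {
            ModelImporter importer = (ModelImporter)assetImporter;

            // These models ship with hand-authored 2nd UV sets, so make sure
            // Unity never regenerates (and silently changes) them on import.
            if (assetPath.Contains("Models/Lightmapped"))
            {
                importer.generateSecondaryUV = false;
            }
        }
    }

Because the script runs on every import on every machine, it also keeps the setting from silently drifting between Mac and PC checkouts.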

How do you add or create a 2nd UV set in Maya to export to Unity if you don’t already have one?

Workflow 1 -> When you already have UVs that are not overlapping and contained within the 0 to 1 co-ordinate space:

  1. Import and select your model in Maya (be sure not to include import path info in your namespaces)
  2. Go to the Polygon Menu Set
  3. Open the Window Menu -> UV Texture Editor to see your current UVs
  4. Go to Create UVs Menu -> UV Set Editor
  5. With your model selected click Copy in the UV Set Editor to create a 2nd UV set
  6. Rename your 2nd UV set to whatever you want
  7. Export your FBX with its new 2nd UV set
  8. Import the Asset back into Unity
  9. Select the FBX in the Unity Project Editor window
  10. Uncheck Generate Lightmap UVs in the FBXImporter Inspector settings.
  11. Re-bake Lightmaps

Workflow 2 -> When you have UVs that are overlapping and/or not contained within the 0 to 1 co-ordinate space:

  1. Import and select your model in Maya (be sure not to include import path info in your namespaces)
  2. Go to the Polygon Menu Set
  3. Open the Window menu -> UV Texture Editor to see your current UVs
  4. Go to Create UVs menu -> UV Set Editor
  5. With your model selected, click either Copy or New in the UV Set Editor to create a 2nd UV set, depending on whether you want to start from scratch (New) or work from what you already have in your original UV set (Copy)
  6. Rename your 2nd UV set to whatever you want
  7. Use the UV layout tools in Maya’s UV Texture Editor to lay out and edit your new 2nd UV set, being certain there are no overlapping UVs and everything is contained in the 0 to 1 UV co-ordinate space (another tutorial on this step will be in a future blog post)
  8. Export your FBX with its new 2nd UV set
  9. Import the Asset back into Unity
  10. Select the FBX in the Unity Project Editor window
  11. Uncheck Generate Lightmap UVs in the FBXImporter Inspector settings.
  12. Re-bake Lightmaps

Workflow 3 -> Add a 2nd UV set to your 3D model in Maya from a model unwrapped in a 3rd party UV tool like Headus UV or ZBrush

  1. Import your original 3D model into the 3rd party application like Headus UV and lay out your 2nd UV set, being certain there are no overlapping UVs and everything is contained in the 0 to 1 UV co-ordinate space (tutorials to come)
  2. Export your model with a new UV set for lightmapping as a new version of your model named something different from the original model.
  3. Import and select your original Model in Maya (be sure not to include import path info in your namespaces)
  4. Go to the Polygon Menu set
  5. Open the Window Menu -> UV Texture Editor to see your current UVs
  6. Go to Create UVs Menu -> UV Set Editor
  7. With your model selected click New in the UV Set Editor to create a 2nd UV set
  8. Select and rename your 2nd UV set to whatever you want in the UV Set Editor
  9. Import the new model with the new UV set back into Maya, being certain it has no overlapping UVs and is contained in the 0 to 1 UV co-ordinate space
  10. Make sure your two models occupy the exact same space, with all transform values (translation, rotation and scale) exactly the same
  11. Select the new model in Maya and be sure its new UV set is selected in the UV Set Editor
  12. Shift select the old model in Maya (you may need to do this in the Outliner) and be sure its 2nd UV set is selected in the UV Set Editor
  13. In the Polygon Menu Set go to the Mesh Menu -> Transfer Attributes Options
  14. Reset the Transfer Attributes Options to their defaults via File -> Reset Settings within the Transfer Attributes menus
  15. Set all Attributes to Transfer to Off except UV Sets, which should be set to Current
  16. Under Attribute Settings, set Sample Space to Topology and leave the rest of the options at their defaults
  17. Click Transfer at the bottom of the Transfer Attributes Options
  18. Delete non-deformer history on the models or the UVs will break by going to the Edit menu -> Delete by Type -> Non-Deformer History
  19. Select the original 3D model’s 2nd UV set in the UV Set Editor window and look at the UV Texture Editor window to see if the UVs are correct
  20. Export your FBX with its new 2nd UV set
  21. Import the Asset back into Unity
  22. Select the FBX in the Unity Project Editor window
  23. Uncheck Generate Lightmap UVs in the FBXImporter Inspector settings.
  24. Re-bake Lightmaps

Once you have added your own 2nd UV sets for Unity lightmapping, there will be no lightmap differences between projects in the Mac and PC Unity editors! You will have ultimate control over how 2nd UV space is packed, which is great for keeping down the vertex counts from your 2nd UV sets, minimizing mipmap bleeding and maintaining consistent lightmap results!

Keep an eye out for more tutorials on UV and lightmap troubleshooting in Unity coming in the near future on the Infrared5 blog! You can also play Brass Monkey’s Monkey Golf to see these examples in action.

-Elliott Mitchell

@mrt3d on Twitter

@infrared5 on Twitter


Aerial Combat in Video Games: A.K.A Dog Fighting

September 27th, 2011 by John Grden

A while back, we produced a Star Wars title for Lucasfilm Ltd. called “The Trench Run”, which did very well on iPhone/iPod sales and was later converted to a web game hosted on StarWars.com. Thanks to Unity3D’s ability to let developers create with one IDE while supporting multiple platforms, we were able to produce these two versions seamlessly! Not only that, but this was one of our first releases to include the now famous Brass Monkey™ technology, which allows you to control the game experience of The Trench Run on StarWars.com with your iPhone/Android device as a remote control. [Click here to see the video on youtube]

Now, the reason for this article is to make good on a promise I made while speaking at Unite 2009 in San Francisco. I’d said I would go over *how* we did the dog fighting scene in The Trench Run, and I have yet to do so. So, without further delay…

Problem

The problems surrounding this issue are severalfold:

  1. How do you get the enemy to swarm “around you”?
  2. How do you get the enemy to attack?
  3. What factors into convincing AI?

When faced with a dog fight challenge for the first time, the first question you might have is how to control the enemies so they keep flying around you (engaging you); getting the dog fight AI and playability right is one of the most important problems to solve. The issue was more than just the simple mechanics of dog fighting; it also included having an “endless” scene, performance constraints on an iDevice, and seamlessly introducing waves of enemies without interrupting game flow and performance.

Solution

In debug mode - way point locations shown as spheres

The solution I came up with was to use what I call “way points”. Way points are just GameObjects positioned in 3D space. I create around 10-15 of them, randomly placed within a spherical area around the player’s ship and anchored to it so that they’re always positioned relative to the player (but don’t rotate with the player – position only). I parent these GameObjects to a GameObject that follows the player, which solves my issue of having vectors always positioned relative to the player. The enemies each get a group of 5 way points and continually fly between them. This keeps the enemies engaged with the player no matter where they fly and gives each enemy the opportunity to get a “lock” on the player and engage. Since the way points move with the player’s position, this also creates interesting flight patterns and behavior for attacking craft, and now we’ve officially started solving our AI problem.

Check out the Demo.  Get the files

Check out the demo – the camera changes to the next enemy that gets a target lock on the player ship (orange exhaust).  Green light means it has a firing lock, red means it has a lock to follow the player.

Download the project files and follow along.

Setting up the Enemy Manager

The EnemyManager takes care of creating the original way points and provides an API that allows any object to request a range of way points. Its functionality is basic and to the point in this area, but it also takes care of creating the waves of enemies and keeping track of how many are in the scene at a time (this demo does not cover that topic; I leave that to you).

First, we’ll create random way points and scatter them around.  Within a loop,  you simply use Random.insideUnitSphere to place your objects at random locations and distances from you within a sphere.  Just multiply the radius of your sphere (fieldWidth) by the value returned by insideUnitSphere, and there you go – all done.
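Here is a minimal sketch of that loop; the names (fieldWidth, wayPointHolder, wayPointCount) follow the description above rather than the actual demo files:

    using UnityEngine;

    public class EnemyManager : MonoBehaviour
    {
        public int wayPointCount = 15;
        public float fieldWidth = 100f;   // radius of the sphere around the player
        public Transform wayPointHolder;  // follows the player's position only

        Transform[] wayPoints;

        void Start()
        {
            wayPoints = new Transform[wayPointCount];
            for (int i = 0; i < wayPointCount; i++)
            {
                GameObject wp = new GameObject("WayPoint" + i);
                // insideUnitSphere returns a random point in a unit sphere;
                // scaling by fieldWidth scatters the points around the player.
                wp.transform.position = wayPointHolder.position +
                                        Random.insideUnitSphere * fieldWidth;
                // Parenting keeps every way point anchored relative to the player.
                wp.transform.parent = wayPointHolder;
                wayPoints[i] = wp.transform;
            }
        }
    }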

Now, the method for handing out way points is pretty straight forward.  What we do here is give our enemy craft a random set of way points.  By doing this, we’re trying to avoid enemies having identical sets and order of way points given to each enemy.
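Continuing the hypothetical EnemyManager sketch above, the hand-out method can be as simple as dealing a random selection so no two enemies end up with the same way points in the same order:

    // Deal out a random set of way points to a requesting enemy.
    public Transform[] GetWayPoints(int count)
    {
        Transform[] set = new Transform[count];
        for (int i = 0; i < count; i++)
        {
            // Sampling with replacement keeps this simple; an occasional
            // duplicate way point in a set is harmless for flight paths.
            set[i] = wayPoints[Random.Range(0, wayPoints.Length)];
        }
        return set;
    }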

NOTE: You can change the scale of the GameObject that the way points are parented to and create different looking flight patterns. In the demo files, I’ve scaled the GameController GameObject on the Y axis by setting it to 2 in the IDE. Now the flight patterns are more vertical and interesting, rather than flat/horizontal and somewhat boring. Also, changing the fieldWidth to something larger will create longer flight paths and make it easier to shoot enemies. A smaller fieldWidth means that they’ll be more evasive and drastic with their moves. Coupled with raising the actualSensitivity, you’ll see that it becomes more difficult to stay behind and get a shot on an enemy.

Setting up the enemy aircraft

The enemy needs to be able to fly on their own from way point to way point. Once they’re in range of their target, they randomly select the next way point. To make this look as natural as possible, we continually rotate the enemy until they’re within range of “facing” the next target; this usually looks like a nice arc/turn, as if a person were flying the craft. This is very simple to do, thankfully.

First, after selecting your new target, update the rotationVector (Quaternion) property for use with the updates to rotate the ship:
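The original listing did not survive here, so the following is a plausible reconstruction from the surrounding text; rotationVector is the article’s name for the cached target orientation, and myWayPoints stands in for the set dealt out by the manager:

    // Sketch: pick a new way point and cache the rotation that faces it.
    void SelectNewTarget()
    {
        currentTarget = myWayPoints[Random.Range(0, myWayPoints.Length)];
        rotationVector = Quaternion.LookRotation(
            currentTarget.position - transform.position);
    }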

Now, in the updateRotation method, we rotate the ship elegantly toward the new way point, and all you have to do is adjust “actualSensitivity” to achieve whatever aggressiveness you’re after:
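Again reconstructed as a sketch rather than the shipped code: Quaternion.Slerp eases the ship toward rotationVector each frame, with actualSensitivity scaling how aggressively it turns:

    // Sketch: rotate a fraction of the way toward the target every frame.
    void updateRotation()
    {
        transform.rotation = Quaternion.Slerp(
            transform.rotation,
            rotationVector,
            actualSensitivity * Time.deltaTime);
    }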

As you’re flying, you’ll need to know when to change targets.  If you wait until the enemy hits the way point, it’ll likely never happen since the way point is tied to the player’s location.  So you need to set it up to see if it’s “close enough” to make the change, and you need to do this *after* you update the enemy’s position:
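A sketch of that per-frame order of operations, with bufferDistance as the “close enough” radius:

    // Sketch: move first, then test distance *after* the position update.
    void Update()
    {
        updateRotation();
        transform.position += transform.forward * speed * Time.deltaTime;

        // The way point tracks the player, so waiting for an exact hit could
        // mean chasing a moving target forever; a radius test avoids that.
        if (Vector3.Distance(transform.position, currentTarget.position) < bufferDistance)
        {
            SelectNewTarget();
        }
    }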

Enemy flying to his next way point



You can also simply change the target for an enemy on a random timer – either way would look natural.

NOTE: Keep the speed of the player and the enemy the same unless you’re providing acceleration controls to match speeds. Also, keep in mind that if your actualSensitivity is low (slow turns) and your speed is fast, you will have to make the bufferDistance larger, since there is an excellent chance that the enemy craft will not be able to make a tight enough turn to get to a way point and will continue to do donuts around it. This issue is fixed if the player is flying around, and is also remedied by using a timer to switch way point targets. You can also add code to make the AI more convincing, for example switching way points very often when the enemy is being targeted by the player (as well as increasing the actualSensitivity to simulate someone who is panicking).

Targeting the Player

Red means target acquired : Green means target lock to fire

The next thing we need to talk about is targeting, and that’s the 2nd part of the AI.  The first part is the enemy’s flight patterns, which we solved with way points.  The other end of it is targeting the player and engaging them.  We do this by checking the angle of the enemy to the player.  If that number falls within the predefined amount, then the currentTarget of the enemy is set to the player.
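In code, that test might look like the following sketch; player is assumed to be the player ship’s Transform, and the 50-degree threshold is the one quoted later in this article:

    // How far off the enemy's nose is the player?
    float enemyAngle = Vector3.Angle(
        transform.forward, player.position - transform.position);

    if (enemyAngle < 50f)
    {
        currentTarget = player;  // lock on and engage instead of way-pointing
    }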

The nice part about this is that if the player decides to fly in a straight path, then eventually (very soon, actually) all of the baddies will be after him and shooting at him because of the rules above. So the nice caveat to all of this is that it encourages the player to fly evasively. If you become lazy, you get shot 0.o

You can also change the “actualSensitivity” property at game time to reflect an easy/medium/hard/jedi selection by the player. If they choose easy, then you set the sensitivity so that the enemy reacts more slowly in turns. If it’s Jedi, then the enemy is a lot more aggressive, and “actualSensitivity” would be set to have them react very quickly to target changes.

Firing

And finally, the 3rd part of the AI problem is solved by having yet another angle variable called “firingAngle”. “firingAngle” is the angle that has to be achieved in order to fire. While the angle for changing targets is much wider (50 degrees), the angle required to fire and hit something is much tighter (15 degrees or less). So we take the “enemyAngle” and check it against “firingAngle”, and if it’s less, we fire the cannons on the player. You could also adjust the “firingAngle” to be bigger for harder levels so that the player’s ship falls into a radar lock more frequently.

In the sample, I added an ellipsoid particle emitter / particle animator / particle renderer to the enemy ship object, set the references to the “leftGun / rightGun” properties and unchecked “Emit” in the inspector. Then, via the Enemy class, I simply set emit to true on both when it’s time to fire:
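A sketch of that firing check, using the Unity 3.x-era legacy ParticleEmitter type that the leftGun / rightGun references would have been in this project:

    // Sketch: only emit while locked on the player inside the firing cone.
    bool fire = (currentTarget == player) && (enemyAngle <= firingAngle);
    leftGun.emit = fire;
    rightGun.emit = fire;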

Conclusion

So, we’ve answered all 3 questions:

  1. How do you get the enemy to swarm “around you”?
  2. How do you get the enemy to attack?
  3. What factors into convincing AI?

With the way point system, you keep the enemy engaged around you and the game play will be very even and feel like a good simulation of a dog fight.  The way points keep the enemy from getting unfair angles and provide plenty of opportunity for the player to get around on the enemy and take their own shots, as well as provide flying paths that look like someone is piloting the ship.  And adjusting values like “actualSensitivity”, “fieldWidth” and “firingAngle” can give you a great variety of game play from easy to hard.  When you start to put it all together and see it in action, you’ll see plenty of room for adjustments for difficulty as well as getting the reaction and look you want out of your enemy’s AI.

Have a Bandit Day!

Smartphones Consolidate to Three Platforms

January 31st, 2011 by Chris Allen

Smartphone Platforms

Just last week Sony announced that they would be supporting Android applications on their new NGP (Next Generation Portable), the highly anticipated successor to the PSP. They also announced that content created for the NGP would be made available on other Android devices, making the PSP games that developers have built available on a wide range of non-Sony devices. Sony is calling this feature the PlayStation Suite. Essentially it’s a store run by Sony for Android, where users can purchase PlayStation games for their tablets and smartphones. This is a bold new move for a company that in the past has stuck to its own monolithic platform, over which it kept complete and total control.

Nokia also looks like they may be going with a similar plan. Rumors are everywhere declaring that Nokia will choose either Android or Windows Phone 7 to run on their devices. Nokia CEO Stephen Elop was quoted as saying “In addition to great device experiences we must build, capitalise and/or join a competitive ecosystem”, implying that they are looking to make a move. While it’s clear that Nokia hasn’t settled on Android yet, the very fact that they are looking to switch indicates the industry is moving towards consolidating into three smartphone operating systems.

In other news, there seem to be reliable sources stating that RIM may be doing something similar with future BlackBerry devices. If BlackBerry and Nokia run Android apps, and Sony devices do as well, this is very good news for mobile game developers. Why is that? Quite simply because there will be fewer platforms to port to.

Already a huge number of game developers are moving to Unity 3D, a game development platform that allows for easy deployment to iOS, Android, its own web player, Nintendo Wii and Xbox 360. Using Unity, the developer writes one code base that will work across multiple platforms with relatively minor tweaks. The fact that Unity already supports two of the main smartphone platforms (iOS and Android) is a huge win for mobile game developers!

With Sony supporting Android apps on the NGP, and RIM and Nokia possibly doing the same, this just means more devices we as game developers can target. Of course, with our sister company’s platform, Brass Monkey, we also are going to have more consumers who will be able to turn their devices into controllers, and that’s definitely a good thing for us. Will the consolidation of operating systems in the market help your business? Are you a mobile game developer who thinks this is good news too? I would love to hear your feedback in the comments.


Big Games, Great Prices

September 30th, 2010 by Mike Oldham

For those of you who haven’t experienced the awesomeness of Star Wars: Trench Run, it’s on sale now in the App Store for $1.99 (originally $4.99). Don’t forget to download the Brass Monkey controller (in-app purchase) for a truly unique gaming experience that will blow your mind!


FITC Mobile 2010

September 3rd, 2010 by Mike Oldham

FITC Mobile 2010

Back for its second year, FITC Mobile is one of the only events covering all aspects of mobile content development – jam-packed with presentations, demonstrations, and panel discussions. With some of the most interesting and engaging presenters from around the world, FITC Mobile is two days and nights that will leave you inspired, energized and awed!

This year Infrared5’s CEO, Chris Allen, will present the new cutting-edge Brass Monkey SDK for creating cross-platform experiences. Chris will open your eyes to the possibilities and how Brass Monkey fits into the convergence of mobile and web.

Don’t miss out on an outstanding lineup of speakers and sessions and buy your tickets before it’s too late. It’s going to be another great conference and I hope to see you all there! Be sure to register with the discount code: infrared5 for $50 off.


Boston Game Loop Unconference Recap

September 2nd, 2010 by Chris Allen

This past weekend I attended Boston Game Loop, and I was really glad I did. Boston Game Loop is an independently run conference organized by Darius Kazemi and Scott Macmillan. The format of the event was unique and quite a refreshing break from the standard session format of most conferences.


The morning started off with breakfast and then a large gathering of the whole group to pick the topics for the day’s sessions. I unfortunately didn’t make it down there for this phase of the conference, but when I arrived I was pleased that my fellow attendees had picked many interesting sessions that I wanted to attend.

I got there just as the second sessions were starting, and I decided to check out the demos to see what other game developers were creating. I also put my name on the list to present Brass Monkey to the group, which I was able to do at the very end. We saw quite a variety of games during the meeting, ranging from RPGs to Flash-based side scrollers and puzzle games. One of the big standouts for me was Elude, a side scroller intended to make the player understand the effects of depression. The mechanics of the game were pretty standard, allowing the character to run, jump onto platforms (trees in this case), plus some other little mini games like seeing how high the character could jump (think the cloud-jumping mini game in the iPhone hit Pocket God). Although the game was rather conventional in many respects, the key part that made it interesting was that the powers of the player character were directly tied to how happy or depressed he was. When the character became horribly depressed he couldn’t jump or move too quickly, the environment changed to be very dark and foreboding, and eventually he falls into a literal pit of despair and the game is over. While I don’t suffer from depression, I’ve definitely had days where I was in a pretty bad funk, and this game really did get you to relate to how severe depression would make you feel.

At lunch I sat down with Scott Payne of Amherst College and had a really exciting discussion on MMOs, Unity 3D and the role of education in gaming. Scott is a really great guy, and extremely knowledgeable about the techniques of learning and how they apply to games.

As I’ve been really interested in game design lately, I decided to attend the session on Narrative Design next. The moderators did an excellent job of moving the discussions forward and keeping them on topic. The group discussed the role of narrative and storytelling in games. Most of what was talked about was very insightful, but focused mostly on large scale, console based first person shooters in the apocalyptic style, think BioShock and Mass Effect. This made sense, as many of the people in the room had either designed one of those games or been mostly focused in this space. With that, I still found the talk very useful, and loved hearing these people discuss the role of choice in affecting a narrative, and how cut scenes aren’t necessarily the way to create the story in a game. I’m personally not a huge fan of the cut scene myself. Another interesting aspect discussed was how changing the perspective from 1st person to 3rd or even 4th gives the player a different feeling and changes the narrative for the user. If you are immersed in a first person perspective game, for example, you can allow yourself to really imagine yourself in the game; when you put the perspective in a more top down style, it gives the player a more controlling, god-like feel.

Next I was going to go to Scott MacMillan’s session on marketing social games, but I got sidetracked and jumped into the middle of an interview that Dave Bolton from Bostinnovation was conducting with Yilmaz Kiymaz and Elliott Mitchell about the Boston gaming community. We ended up having a great conversation and came up with a little game concept of our own called The Dog House. The Dog House is a game where your goal is to stay in your significant other’s good graces by answering questions that an AI-driven character asks. Conferences like Game Loop are really great for networking, and one of the biggest values to me is that they allow game developers to get together and have impromptu brainstorming sessions like this.

For the next session I went to the one on side projects run by Darren Torpey. The discussion went into how to make a side project a success, and there was lots of talk about how to pick good collaborators, how to keep up momentum and how to maintain positive morale when folks aren’t necessarily getting paid for the work they are doing. We also covered how to run a project with people in different locations and busy schedules. I got to talk about how we made Red5 a success and ultimately launched Infrared5. It was cool to reflect on how our company was essentially founded based on a successful side project. Of course our latest side project to turn into a real project is Brass Monkey, and while most of the conversations focused on individuals running side projects, it was great to be able to talk about how a company can fit in a side project and make it a success as well. Overall this session was insightful, and hopefully I helped inspire some of the other game developers that were there too.

For the last session of the day I attended Predicting the Future, led by Ben Sayer. This was essentially a round table discussion on what we would bet would happen in the games industry within one year. I was pleased to find that one of the things brought up, and unanimously agreed on, was that connecting mobile devices as controllers for games would be a big thing by next year. Someone else in the audience actually brought up Brass Monkey as the catalyst of this movement. I sure hope they are right, as this spells a sure success for our product. Other predictions included: Zynga would go public or be acquired within a year; Nintendo’s 3DS would be a huge success; 3D TVs would be a flop; 38 Studios wouldn’t actually move to Rhode Island and more than likely wouldn’t launch a game by next year; HTML5 would be used for a lot more games by next year; and a whole lot more. I look forward to seeing which predictions come true at next year’s Game Loop.

After the sessions ended the majority of the group went over to the Cambridge Brewing Company for drinks and dinner.

Overall I thought Game Loop 2010 was a great success. I know there’s been a lot of talk about it being too big this year and a bit too unorganized with so many folks involved. I do think there’s an element of truth in that, but the end result for me was still a valuable experience. One suggestion I would make to the organizers is to hold the planning session for the event the night before. That way, people who are fine with attending sessions that others have picked could skip this step and just show up the next day, while the people who really want to set the agenda attend the planning part. The planning session could then end and turn into a night of socializing and getting to know the other attendees. Then everyone comes back fresh and ready to attend all the sessions they planned for the conference in the morning. Another suggestion is to group the sessions on the board by topic, so that people can more easily focus on the areas they are interested in.

Did you attend Game Loop this year? What did you think, and how could it be improved for next year? I would love to hear your feedback!

Trailer: Star Wars Trench Run 2.0

August 4th, 2010 by Mike Oldham

Star Wars: Trench Run 2.0 Trailer from THQ Wireless on Vimeo.


Star Wars: Trench Run 2.0 & Brass Monkey Launch

July 21st, 2010 by Mike Oldham

Many of you may have gotten small tastes of Star Wars: Trench Run 2.0 and the Brass Monkey game controller from recent videos and articles. It’s literally been months of anticipation, but the moment we’ve been waiting for has finally arrived. We are pleased to announce that Trench Run 2.0 and the Brass Monkey controller have finally gone live in the iTunes store and on StarWars.com.


Brass Monkey Interview with Boston Innovation’s Kyle Psaty

July 7th, 2010 by Mike Oldham

Big thanks to Kyle for coming out and putting together this great video! Check out BostonInnovation for more information on local start-ups.

