Multithreading Perceptual Computing Applications in Unity3d

August 21st, 2013 by Steff Kelsey

Once again we are excited to be featured on the Intel Developer Zone blog for a post by Infrared5 Senior Developer Steff Kelsey. The post, Multithreading Perceptual Computing Applications in Unity3d, talks about the challenges the Infrared5 team faced while participating in the Intel Ultimate Coder Challenge. Follow the link to read about the steps we went through to make Kiwi Catapult Revenge a reality! As always, we love hearing your feedback, so feel free to share this with others.

http://software.intel.com/en-us/blogs/2013/07/26/multithreading-perceptual-computing-applications-in-unity3d


Trajectory of a Basketball in Unity3D

July 2nd, 2013 by Steff Kelsey

This post discusses targeting projectiles with Unity3D. The main applications of targeting projectiles using physics are sports simulations (basketball, golf, etc.) and anything else where you want to launch an object into a world with gravity and have it hit a desired target.
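The full post works through the math; as a taste, here is a minimal sketch (not the code from the post itself, and the class and member names are illustrative) of the classic closed-form approach: pick a launch angle, then solve the ballistic equation for the initial speed that carries the projectile to the target.

using UnityEngine;

public class ProjectileLauncher : MonoBehaviour
{
    public Transform target;         // where we want the projectile to land
    public float launchAngle = 45f;  // degrees above the horizontal

    // Solve y = x*tan(a) - g*x^2 / (2*v^2*cos^2(a)) for v, then split v into
    // horizontal and vertical components. Returns Vector3.zero when the
    // target cannot be reached at the given angle.
    public static Vector3 BallisticVelocity(Vector3 origin, Vector3 targetPos, float angleDegrees)
    {
        Vector3 toTarget = targetPos - origin;
        Vector3 flat = new Vector3(toTarget.x, 0f, toTarget.z);
        float x = flat.magnitude;       // horizontal distance to the target
        float y = toTarget.y;           // height difference to the target
        float g = -Physics.gravity.y;   // gravity magnitude (gravity points down)
        float a = angleDegrees * Mathf.Deg2Rad;

        float denominator = 2f * (x * Mathf.Tan(a) - y) * Mathf.Pow(Mathf.Cos(a), 2f);
        if (denominator <= 0f)
            return Vector3.zero;        // target is too high/close for this angle

        float speed = Mathf.Sqrt(g * x * x / denominator);
        return flat.normalized * speed * Mathf.Cos(a) + Vector3.up * speed * Mathf.Sin(a);
    }

    // Example use: hand a rigidbody basketball its launch velocity.
    public void Launch(Rigidbody ball)
    {
        ball.velocity = BallisticVelocity(transform.position, target.position, launchAngle);
    }
}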



Unity 4 – Looking Forward

September 28th, 2012 by Anthony Capobianchi

Here at Infrared5, a good portion of our projects are based in Unity 3D. Needless to say, with the introduction of Unity 4, I was very interested in what had changed in the coming engine, and why people should make the upgrade. This post will look at a few of the features of Unity 4 that I am most excited about.

The New GUI

The first time I sat down with Unity almost a year ago to work on Brass Monkey’s Monkey Dodgeball game, I knew practically nothing about the engine. That didn’t stop me from being almost immediately annoyed with Unity’s built-in GUI system. Positioning elements from within OnGUI was a task of trial and error, grouping objects together was a pain, and all the draw calls it produced made it inefficient to boot. At the time, I was unaware of the better solutions to Unity’s GUI developed by third parties, and once I learned of them, I was confused as to why such a robust development tool as Unity didn’t have them built in already.
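For anyone who hasn’t felt that pain, a small illustration of the old immediate-mode workflow (my own sketch, not code from a real project): every element is positioned with hand-tuned pixel Rects inside OnGUI, and each control typically adds its own draw call.

using UnityEngine;

public class OldGUIExample : MonoBehaviour
{
    void OnGUI()
    {
        // Magic-number pixel coordinates: change the resolution and the layout breaks.
        GUI.Box(new Rect(10, 10, 220, 90), "Menu");
        if (GUI.Button(new Rect(20, 40, 200, 24), "Start Game"))
            Debug.Log("Start pressed");
        GUI.Label(new Rect(20, 68, 200, 24), "Score: 0");
    }
}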

Though the new GUI system is not a launch feature for Unity 4, Unity is building an impressive user interface system that will allow for some really interesting aesthetics in our games. From the looks of it, the new system derives from Unity’s current vein of GUIText and GUITexture objects. The difference is in the animation capabilities of each element that is created. You can now efficiently have multiple elements make up your GUI objects, such as buttons and health bars, and Unity allows you to animate those elements individually. Not to mention that editing text styles in the GUI is now as easy as marking the text up with HTML.

One of the coolest additions is the ability to position and resize any UI element with transform grabbers that anyone who has used an Adobe product will find familiar. You can also rotate elements in 3D space, giving a GUI a sense of space and depth, which can lead to some really interesting effects.

The new GUI system will come packaged with pre-built controls, though there is no word yet on whether those controls will be customizable. Unity lists one new control as a “finely tuned thumbstick [control] for mobile games”. A couple of months ago, I developed my own thumbstick-like controls for maneuvering in 3D space, and it was a pain; hopefully these new controls will make that a lot easier. You can also easily create your own controls by extending the GUIBehavior script, so developers should have no problem creating controls that handle the specifics of their own games.
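A hand-rolled thumbstick along those lines might look roughly like the minimal sketch below (the names and numbers are illustrative, not the code from that project): track a touch inside a screen region and turn its offset from the region’s center into a movement vector.

using UnityEngine;

public class SimpleThumbstick : MonoBehaviour
{
    public Rect touchArea = new Rect(0, 0, 200, 200); // bottom-left corner of the screen, in pixels
    public float radius = 80f;                        // maximum stick travel, in pixels

    // Normalized output in the -1..1 range, read by your movement code.
    public Vector2 Direction { get; private set; }

    void Update()
    {
        Direction = Vector2.zero;
        for (int i = 0; i < Input.touchCount; i++)
        {
            Touch touch = Input.GetTouch(i);
            if (touchArea.Contains(touch.position))
            {
                // Offset from the pad's center, clamped to the stick's travel.
                Vector2 offset = touch.position - touchArea.center;
                Direction = Vector2.ClampMagnitude(offset / radius, 1f);
                break;
            }
        }
    }
}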

Every image that you use to create your elements gets atlased automatically. This is a huge bonus over the old GUI system, whose biggest problem right now is the number of draw calls it makes to render all those elements. Third-party tools like EZGUI and iGUI rely on creating UI objects that atlas images to reduce draw calls, so it will be nice to have that kind of functionality in a built-in system. I’ve spent a lot of time developing user interfaces in Unity over the past few months, so it makes me really excited to see Unity correcting some of its flaws in a component that is so important to games.

Mecanim
Unity’s current animation system is pretty basic: add animations to an object and trigger those animations based on whatever input or conditions are needed. The animation blending was useful but could have been better. With Unity 4, it is better. Enter Mecanim: an animation blending system that uses blend trees for models with rigged bones to move fluidly from one animation to another. One of the biggest hurdles we developers need to overcome in projects with a lot of animations is transitioning between those animations as seamlessly as possible. Not always easy!

Along with blending animations, Mecanim lets you edit your animations much as you would edit a film clip to create animation loops. Mecanim also supports IK, so, for example, it can adjust the position of a character’s feet on uneven surfaces, bend hands around corners, etc. A couple of years ago I was fascinated by Natural Motion’s Endorphin engine for animation blending. Mecanim may not be as sophisticated as Endorphin, and it only supports biped skeletons, but it seems like an incredible system to have built into Unity.

The best part about this is that once you create a blend tree for your animations, you can drag and drop it onto another rigged model, and it will work even if the new model has a different size or proportions.
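Based on what Unity has shown, driving a blend tree from code should stay simple: the tree itself is authored in the editor, and a script just feeds it parameters. A minimal sketch, assuming an Animator component and a blend tree with a float parameter named “Speed”:

using UnityEngine;

public class CharacterLocomotion : MonoBehaviour
{
    private Animator animator;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    void Update()
    {
        // "Speed" is an example parameter on the blend tree; as its value
        // changes, Mecanim blends between, say, idle, walk and run clips.
        float speed = Mathf.Abs(Input.GetAxis("Vertical"));
        animator.SetFloat("Speed", speed);
    }
}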


The Mobile Platform

The mobile scene is really where Unity shines. Most of the Unity projects I have worked on at Infrared5 have had some sort of mobile component to them, and the mobile platform is going to get even better with Unity 4. The most interesting addition from a developer’s standpoint is the profiling system, which lets you view your game’s GPU performance to determine where it runs smoothly and where it needs more optimization. The addition of real-time shadows for mobile is a nice bonus; it will definitely add a lot of aesthetic value to the products we make.

Unity 4 is going to hit the industry with amazing force. I, for one, cannot wait to get my hands on this engine and am already filled with ideas on how I want to utilize these new tools. My favorite part is going to be the mobile optimization. Mobile development is huge, and it’s not going anywhere anytime soon. With all the new capabilities of Unity’s mobile content, I should be kept interested for quite a while.


Boid Flocking and Pathfinding in Unity

June 20th, 2012 by Anthony Capobianchi

The quest for creating believable, seemingly intelligent movement among groups of characters is a challenge many game developers encounter. A while back, I had an idea for an app that required units of moveable objects to coordinate and get from point ‘A’ to point ‘B’. I’ve never built anything like an RTS before, so this concept was something new to me. I searched forums and articles looking for answers on how people achieve this sort of behavior. The majority of the help I could find was on a system referred to as “Boids” or “flocking”. A Boid system can be set up to simulate flocks or herds, allowing moving units to animate independently and giving the illusion of artificial intelligence. Over the next three blog posts, I will outline solutions to the following problems:

  1. Boid System – Creating a system to keep objects together in a coherent way.
  2. Radar Class – Creating a class that will detect if any Boids are within a certain distance from another Boid.
  3. Path and Obstacle Avoidance – Getting an object to follow a path as well as avoid obstacles that keep it from getting to its destination.
  4. Ray Caster – Setting up a ray caster that will be used to place a destination object in the scene for the Boids.
  5. Destination Points – Organizing a number of different destination points for the Boids that will prevent them from piling on top of each other.

BOID SYSTEM

Normally, I would split the functions into different scripts depending on their purpose: for instance, a script for calculating the Boid behavior forces, a script for the radar, and a script for path calculation. However, to keep the script count down, and to avoid the confusion of not always knowing which script a function belongs in, I consolidated everything into just a few scripts:

I.    Boid.cs
II.    Destination.cs
III.    DestinationManager.cs

NOTE: All properties and variables should go at the top of your classes. In my code examples, I am putting the properties above the methods that use them to show you which properties you need in your script and why.

•    Boids

The Boid system is accomplished by creating a script (named Boid.cs) that controls the Boids’ basic movement behaviors. These include cohesion, which is the will to stick together, and separation, which is the will to keep apart. If you are unfamiliar with the concept of Boids or flocking, a great article about it can be found at http://www.vergenet.net/~conrad/boids/pseudocode.html and a C# Unity example can be found here: http://virtualmore.org/wiki/index.php?title=SimpleBoids

To set up my Boid.cs script, I set up these properties:
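A minimal sketch of those properties (the names and default values here are illustrative):

using UnityEngine;
using System.Collections.Generic;

public class Boid : MonoBehaviour
{
    // Behavior weights: how strongly each steering force pulls on this Boid.
    public float cohesionWeight = 1.0f;      // pull toward the local group's center
    public float separationWeight = 1.5f;    // push away from close neighbors
    public float separationDistance = 2.0f;  // how close is "too close"

    // Radar settings, used by the Radar functions shown further down.
    public float scanRadius = 10.0f;         // how far this Boid can "see"
    public float scansPerSecond = 4.0f;      // scan a few times per second, not every frame

    // Neighbors found by the most recent radar scan.
    private List<Boid> neighbors = new List<Boid>();
}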

My Boid behaviors are set up like this:
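In sketch form, continuing the illustrative names above (these methods live inside Boid.cs): cohesion steers toward the average position of the detected neighbors, and separation pushes away from any neighbor closer than separationDistance; the resulting forces are summed and applied to the Boid’s movement.

// Cohesion: steer toward the center of the neighbors found by the radar.
Vector3 Cohesion()
{
    if (neighbors.Count == 0) return Vector3.zero;
    Vector3 center = Vector3.zero;
    foreach (Boid b in neighbors)
        center += b.transform.position;
    center /= neighbors.Count;
    return (center - transform.position) * cohesionWeight;
}

// Separation: push away from any neighbor that is too close,
// weighting nearer neighbors more heavily.
Vector3 Separation()
{
    Vector3 push = Vector3.zero;
    foreach (Boid b in neighbors)
    {
        Vector3 away = transform.position - b.transform.position;
        float distance = away.magnitude;
        if (distance < separationDistance && distance > 0f)
            push += away.normalized / distance;
    }
    return push * separationWeight;
}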

• RADAR CLASS

We need a way for every Boid to know if there are other Boid objects within a certain radius of it. To create this effect, we will write functions that handle the radar scans and what the scanner is looking for. The radar is called to scan a few times every second rather than being called from Update. If it were, every frame would make a collision check using Physics.OverlapSphere, which could cause frame rates to drop, especially if you have a lot of Boids in the scene.

In my Boid.cs script my Radar functions are set up like this:
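A minimal sketch of that setup, again inside Boid.cs with the illustrative names from above: InvokeRepeating schedules the scan a few times per second, and Physics.OverlapSphere gathers the nearby colliders.

// Schedule the radar instead of scanning in Update every frame.
void Start()
{
    // Random start offset so all the Boids don't scan on the same frame.
    InvokeRepeating("Scan", Random.value, 1.0f / scansPerSecond);
}

// Collect every other Boid within scanRadius into the neighbors list.
void Scan()
{
    neighbors.Clear();
    Collider[] hits = Physics.OverlapSphere(transform.position, scanRadius);
    foreach (Collider hit in hits)
    {
        Boid other = hit.GetComponent<Boid>();
        if (other != null && other != this)
            neighbors.Add(other);
    }
}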

In my next post, I will explain how I solved the problem of getting an object to follow a path while avoiding obstacles. In addition, I will explain what is needed to apply the forces calculated by the Boid and pathfinding systems to get our characters moving.
Anthony Capobianchi


Creating 2nd UV sets in Maya for Consistent and Reliable Lightmapping in Unity 3d

January 11th, 2012 by Elliott Mitchell

Lightmaps in the Unity Editor - Courtesy of Brass Monkey - Monkey Golf

Have you ever worked on a game that was beautifully lit in the Unity editor but ran like ants on molasses on your target device? Chances are you might benefit from using lightmaps. Ever worked on a game that was beautifully lit with lightmaps but looked different between your Mac and PC in the Unity editor? Chances are you might want to create your own 2nd UV sets in Maya.

Example of a lightmap

If you didn’t know, lightmaps are 2D textures pre-generated by baking (rendering) lights onto the surfaces of 3D objects in a scene. These textures are additively blended with the 3D model’s original textures to simulate illumination and fine shadows without the use of realtime lights at runtime. The number of realtime lights rendering at any given time can make or break a 3D game when it comes to performance. By reducing the number of realtime lights and shadows, your games will play more smoothly. Using fewer realtime lights also frees up resources for other aspects of the game, like higher poly counts and more textures. This holds true especially when developing for most 3D platforms, including iOS, Android, Mac, PC, Web, Xbox, PS3 and more.

Since the release of Unity 3 back in September 2010, many Unity developers have been taking advantage of Beast lightmapping as a one-stop solution within the Unity editor. At first glance, Beast is a phenomenal time-saving and performance-enhancing tool. Beast can quickly automate several tedious tasks that would otherwise need to be performed by a trained 3D technical artist in an application like Maya. Those tasks, mostly UV related, are:

UVs in positive UV co-ordinate space

  • Generating 2nd UV sets for lightmapping 3D objects
  • Unwrapping 3D geometry into flattened 2D shells which don’t overlap in the 0 to 1 UV co-ordinate quadrant
  • Packing UV shells (arranging the unwrapped 2D shells to optimally fit within a square quadrant with room for mipmap bleeding)
  • Atlasing lightmap textures (combining many individual baked textures into larger texture sheets for efficiency)
  • Indexing lightmaps (linking multiple 3D model’s 2nd UV set UV co-ordinate data with multiple baked texture atlases in a scene)
  • Additively applying the lightmaps to your existing model’s shaders to give 3D objects the illusion of being illuminated by realtime lights in a scene
  • Tweaking other UV properties in the Advanced FBX import settings that influence how the 2nd UVs are unwrapped and packed, all of which may drastically alter your final results and do not always transfer through version control

Why is this significant? Well, your 3D object’s original UV set is typically used to align and apply textures like diffuse, specular, normal and alpha maps onto the 3D object’s surfaces. There are no real restrictions on laying out your UVs for texturing: UVs may be stretched to tile a texture, they can overlap, they can be mirrored, and so on. Lightmap texturing requirements in Unity, on the other hand, are different and require:

  • A 2nd UV set
  • No overlapping UVs
  • UVs contained in the 0 to 1, 0 to 1 UV co-ordinate space

Unwrapping and packing UVs so they don’t overlap and are optimally contained in the 0 to 1 UV co-ordinate space is tedious and time consuming for a tech artist. Many developers without a tech artist purchase 3D models online to “save time and money”. Typically, those models won’t have 2nd UV sets included. Beast can unwrap lightmapping UVs for the developer without much effort in the Unity Inspector by:

Unity FBX import settings for Lightmapping UVs

Advanced Unity FBX import settings for Lightmapping UVs

  • Selecting the FBX to lightmap in the Unity Project Editor window
  • Setting the FBX to Static in the Inspector
  • Checking Generate Lightmap UVs in the FBXImporter Inspector settings
  • Changing options in the Advanced Settings if needed

Atlasing multiple 3D models’ UVs and textures is extremely time consuming and not always practical, especially when textures and models may change at a moment’s notice during development. Frequent changes to atlased assets tend to create overwhelming amounts of tedious work. Again, Beast’s automation is a great time saver, allowing flexibility in atlasing for iterative level design plus scene, object and texture changes in the Unity editor.

Sample atlases in Unity

Beast’s automation is truly great, except when your team is using both Mac and PC computers on the same project with version control, that is. Sometimes lightmaps will appear totally fine on a Mac and look completely messed up on a PC, and vice versa. Remedying this is daunting and may require, among several other tasks, re-baking all the lightmaps for the scene.

Why are there differences between Mac and PC when generating 2nd UV sets in Unity? The answer is that Mac and PC computers use different floating point precision to calculate and generate the 2nd UV sets for lightmapping upon import into the Unity editor. The differences between Mac- and PC-generated UVs are minute but can lead to drastic visual problems. One might assume that with version control like Unity Asset Server or Git the assets would be synced and exactly the same, but they are not. Metadata and version control issues are a topic for another blog post down the road.

What can one do to avoid issues with 2nd UV sets across Mac and PC computers in Unity? Well, here are four of my tips for avoiding lightmap issues in Unity:

Inconsistent lightmaps on Mac and PC in the Unity Editor - Courtesy of Brass Monkey - Monkey Golf

  1. Create your own 2nd UV sets and let Beast atlas, index and apply your lightmaps in your Unity scene
  2. Avoid re-importing or re-generating 2nd UV assets if the project is being developed in Unity across Mac and PC computers and you’re not creating your own 2nd UV sets externally
  3. Use external version control like Git with Unity Pro, with metadata set to be exposed in the Explorer or Finder, to better sync changes to your assets and metadata
  4. Use 3rd party editor scripts like Lightmap Manager 2 to help speed up the lightmap baking process by letting you re-bake single objects without having to re-bake the entire scene

Getting Down To Business – The How To Section

If your 3D model already has a good 2nd UV set and you want to enable Unity to use it:

  • Select the FBX in the Unity Project Editor window
  • Simply uncheck Generate Lightmap UVs in the FBXImporter Inspector settings
  • Re-bake lightmaps

How do you add or create a 2nd UV set in Maya to export to Unity if you don’t have one already available?

Workflow 1 -> When you already have UVs that are not overlapping and contained within the 0 to 1 co-ordinate space:

  1. Import and select your model in Maya (be sure not to include import path info in your namespaces)
  2. Go to the Polygon Menu Set
  3. Open the Window Menu -> UV Texture Editor to see your current UVs
  4. Go to Create UVs Menu -> UV Set Editor
  5. With your model selected click Copy in the UV Set Editor to create a 2nd UV set
  6. Rename your 2nd UV set to whatever you want
  7. Export your FBX with its new 2nd UV set
  8. Import the Asset back into Unity
  9. Select the FBX in the Unity Project Editor window
  10. Uncheck Generate Lightmap UVs in the FBXImporter Inspector settings.
  11. Re-bake Lightmaps

Workflow 2 -> When you have UVs that are overlapping and/or not contained within the 0 to 1 co-ordinate space:

  1. Import and select your model in Maya (be sure not to include import path info in your namespaces)
  2. Go to the Polygon Menu Set
  3. Open the Window menu -> UV Texture Editor to see your current UVs
  4. Go to Create UVs menu -> UV Set Editor
  5. With your model selected, click either Copy or New in the UV Set Editor to create a 2nd UV set, depending on whether you want to start from scratch or work from what you already have in your original UV set
  6. Rename your 2nd UV set to whatever you want
  7. Use the UV layout tools in Maya’s UV Texture Editor to lay out and edit your new 2nd UV set, being certain to have no overlapping UVs and everything contained in the 0 to 1 UV co-ordinate space (another tutorial on this step will come in a future blog post)
  8. Export your FBX with its new 2nd UV set
  9. Import the Asset back into Unity
  10. Select the FBX in the Unity Project Editor window
  11. Uncheck Generate Lightmap UVs in the FBXImporter Inspector settings.
  12. Re-bake Lightmaps

Workflow 3 -> Add a 2nd UV set to your 3D model in Maya from a model unwrapped in a 3rd party UV tool like Headus UV or ZBrush

  1. Import your original 3D model into the 3rd party application, like Headus UV, and lay out your 2nd UV set, being certain to have no overlapping UVs and everything contained in the 0 to 1 UV co-ordinate space (tutorials to come)
  2. Export your model with the new UV set for lightmapping as a new version of your model, named something different from the original model
  3. Import and select your original model in Maya (be sure not to include import path info in your namespaces)
  4. Go to the Polygon Menu Set
  5. Open the Window Menu -> UV Texture Editor to see your current UVs
  6. Go to Create UVs Menu -> UV Set Editor
  7. With your model selected, click New in the UV Set Editor to create a 2nd UV set
  8. Select and rename your 2nd UV set to whatever you want in the UV Set Editor
  9. Import the new model with the new UV set, being certain it has no overlapping UVs and everything contained in the 0 to 1 UV co-ordinate space
  10. Make sure your two models occupy the exact same space, with all transform nodes (translation, rotation and scale values) being exactly the same
  11. Select the new model in Maya and be sure its UV set is selected in the UV Set Editor
  12. Shift select the old model in Maya (you may need to do this in the Outliner) and be sure its 2nd UV set is selected in the UV Set Editor
  13. In the Polygon Menu Set go to the Mesh Menu -> Transfer Attributes Options
  14. Reset the Transfer Attributes Options settings to default via File -> Reset Settings within the Transfer Attributes menus
  15. Set Attributes to Transfer all to -> Off except for UV Sets to -> Current
  16. Set Attribute Settings to -> Sample Space Topology with the rest of the options at default
  17. Click Transfer at the bottom of the Transfer Attributes Options
  18. Delete non-deformer history on the models, or the UVs will break, by going to the Edit menu -> Delete by Type -> Non-Deformer History
  19. Select the original 3D model’s 2nd UV set in the UV Set Editor window and look at the UV Texture Editor window to see if the UVs are correct
  20. Export your FBX with its new 2nd UV set
  21. Import the Asset back into Unity
  22. Select the FBX in the Unity Project Editor window
  23. Uncheck Generate Lightmap UVs in the FBXImporter Inspector settings
  24. Re-bake Lightmaps

Once you have added your own 2nd UV sets for Unity lightmapping, there will be no lightmap differences between projects in the Mac and PC Unity editors! You will have ultimate control over how the 2nd UV space is packed, which is great for keeping down the vertex counts from your 2nd UV sets, minimizing mipmap bleeding and maintaining consistent lightmap results!

Keep an eye out for more tutorials on UV and lightmap troubleshooting in Unity coming in the near future on the Infrared5 blog! You can also play Brass Monkey’s Monkey Golf to see these examples in action.

-Elliott Mitchell

@mrt3d on Twitter

@infrared5 on Twitter
