Marvel Heroes @ PAX Trailer

 

My lighting does not hold up when it's not at game angle or distance.  The texture compression settings aren't designed for being zoomed in like this either, so I'm not happy with marketing right now..

How time flies when you work work work.

First: I'm working on Marvel Heroes at Secret Identity Studios.  Here are some links.

http://kotaku.com/5864679/several-avengers-assemble-for-the-marvel-universe-mmo/gallery/1

https://www.marvelheroesonline.com/

Second! I’m writing this post with a fever.

I've been stretching my skills into other avenues of technical artistry for the last year or so on MU.  Having had a few brief encounters with UnrealEd on some unnamed in-between projects here and there, I was asked if I'd want to work on the VFX pipeline and scripting.  Then, after less than six months and some colleagues having to make some hard decisions, I was thrown head first into everything as the ONLY tech artist on the project.

Oh Unreal, how I heart thee, before we gutted the crap out of you and broke you so badly.   It really has a mature tool set, and it's a place where a tech artist has a hard choice to make between a support role and a content generation role.  My art side won out, and they put me in charge of lighting.

PSST! All the really nifty shit is super technical but still kinda artist friendly? Tech artist friendly at least?!  Once you get past the whole "we broke the engine's pipeline" thing, and its already enormous learning curve. (Much like learning to love a 3D package from scratch again.)

Unreal is far from perfect, and I've cursed many profanities over the months.  My personal favorite curse, and this gets me every time I do it, is that a material instance allows itself to be plugged into its own parent. Really?! It's very easy to do with the interface when you have two content browsers open.   This results in a complete crash and me crying a little.

Hope to be allowed screen shots soon!


Fortune Online VFX

 

At about the same time as the Offbase PS3 gig I was doing on the side, management lent some of the devs, myself included, to a sister studio working on a Flash title.  So for a few months I was plunking away at some VFX sprite work.

In the video examples, if it moves, I made it.

The pipeline challenge was the very low frame count the VFX had to be produced with.   The fire loops were 15 frames long in total, and many reuse the same sprite sheets at different scales and layers to cut down on download footprint; those were probably the most expensive loops I made.  Most effects were 3-5 frames, and we rocked them back and forth to get more out of them.  Many others were just animated by hand, or were single-frame renders edited in Photoshop.
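
As a rough illustration of that rock-it-back-and-forth trick, here's a tiny sketch of how a few-frame loop stretches into a longer playback sequence. This is illustrative only; the function and names are hypothetical, not the production Flash code.

```python
# Hypothetical sketch: turn a short 3-5 frame loop into a longer "rocked"
# sequence by playing it forward, then backward, so a tiny sprite sheet
# reads as a longer animation.

def pingpong_frames(frame_count, cycles=2):
    """Return a frame-index playlist that rocks back and forth.

    frame_count -- number of unique frames in the sprite sheet
    cycles      -- how many forward/backward passes to emit
    """
    forward = list(range(frame_count))   # e.g. [0, 1, 2, 3, 4]
    backward = forward[-2:0:-1]          # e.g. [3, 2, 1]
    return (forward + backward) * cycles

if __name__ == "__main__":
    # A 5-frame fire loop becomes a 16-step sequence without new art.
    print(pingpong_frames(5))
```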

However, since everything rendered down to sprites, the sky was the limit for polys and particles!  The fire and steam source images were both made using Maya Fluids, and then layered together with some compositing tricks in After Effects.  The base art with the braziers, torch handle, and chandelier was provided by their modeling artists to us as sprites as well, and the final VFX were authored to layer over them in Flash.

Fortune Online is free to play and currently live.

Project Redwood

 

I came on in the middle of Project Redwood, but I'd call it the beginning for the art.  We were in 3ds Max at the time, I had a massive list of rigs and content ahead of me, and there was zero infrastructure in place.  There was a game, a server, and a look, but within a few months the entire art team reorganized under a new AD and we rebooted the art look.

I can’t really say I had much to do with the look shy of the squash and stretch functionality in the rigs.  The new AD got a strike force together and started over in environments to get the new look, and slowly worked their way over into characters.  This was great because it gave me a chance to work ahead on the pipeline tools and procedures.

BigWorld was tough to work with, to put it nicely.  In its defense, we were trying to get it to do things it wasn't intended to do.  By the time the project was put on hold, though, we had one hell of an art pipe cranking, thanks to the BlueShift guys who came in and rewrote the exporter under the hood.

I was a senior tech artist working with the lead character modeler to create a visual language and system for character outfit customization. I was in charge of the two avatar-level rig pipelines, from animation systems to visual systems.  One was the main player avatars: 3 races and 2 genders each, all with 8 or more weapon systems (including musical instruments), full facial animation, squash and stretch, an in-Maya skin browser, in-Maya weapon switching, holster logic, and 16 art slots for armor and clothing, along with the logic to hook it all up in game.  The other avatar-level rig was for the pet class, also with armor customization, squash and stretch, and full facial systems.  I did some of the mobs when they could use an existing pet rig class, but I was also writing animation tools and automating the export pipeline for environments, props, mobs, avatars, pets, and avatar customization.   I also rigged up the loot drops so that they squashed and stretched when they popped into the world.

 

Below is some more gameplay footage:

 

Star Trek Online @ Perpetual Entertainment

 

Star Trek Online was the project I grew the most on.  We started from the ground up and built our own engine, and thus our own tools and art pipeline.  We were a small team for the three years the project was going, as Gods and Heroes was delayed and became the studio's focus.  We'd even stop work on STO to lend a hand now and again to the GnH team.

Prior to starting at Perpetual, while at New Pencil, I worked remotely as a rigger, modeler, and animator for the GnH team. That made me familiar with GnH's art process and their rigger, John Stewart.  He and I collaborated on the character rigging art pipeline for STO and created a joint library of support scripts, tools, processes, and morgue body parts that both sets of animators could work from.

This was the first time I got to do a bone-driven facial system for in-game use.  Some of the earlier tests with the Bimphalian race are shown.  The final system's UI and visemes were influenced by Jason Osipa's Stop Staring; however, the similarities stopped there. Instead of a single set of blendshapes based on the character mesh, I had multiple small, simple geo blendshapes whose outputs drove bone translations and rotations.  That way the components of the face could be rearranged, re-proportioned, and scaled for the different aliens quickly, without having to re-wire things from scratch, yet the complex motions of the facial system stayed simplified under the viseme sliders for fast animation.  Also, by keeping the viseme sliders all 0-1 and the viseme channels the same, facial animation, which was so costly to do, was sharable between all the facial rigs.
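
To make the blendshape-to-bone idea concrete, here's a minimal Maya Python sketch of that wiring. It's a sketch only; the function, node names, and exact constraint choices are assumptions for illustration, not the shipped STO rig code.

```python
# A tiny proxy shape is deformed by blendshapes, and a face joint simply
# follows a point on that proxy, so the viseme sliders stay 0-1 while the
# bones do the actual in-game deformation.  All names are hypothetical.
from maya import cmds

def wire_proxy_to_joint(proxy_shapes, base_proxy, face_joint):
    """Drive *face_joint* from blendshapes applied to *base_proxy*.

    proxy_shapes -- list of small sculpted target meshes (e.g. jaw open)
    base_proxy   -- the neutral proxy mesh that lives near the face
    face_joint   -- the skinned joint the game engine actually sees
    """
    # One blendShape node gives us the 0-1 viseme-style sliders.
    blend = cmds.blendShape(proxy_shapes, base_proxy, name='visemeShapes')[0]

    # A locator glued to the proxy picks up the sculpted motion...
    loc = cmds.spaceLocator(name='facePointLoc')[0]
    cmds.pointOnPolyConstraint(base_proxy, loc)

    # ...and the joint just follows the locator, so the motion ends up as
    # bone translation that can be exported to the engine.
    cmds.pointConstraint(loc, face_joint)
    return blend
```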

I also wrote a quick auto-rigger and skinner for the starships, for warp deformation.
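
For flavor, here's a rough sketch of what a bare-bones version of that could look like in Maya Python; the joint count, naming, and binding choice are hypothetical stand-ins, not the actual tool.

```python
# Hypothetical sketch of a quick starship auto-rigger for warp stretch:
# a short joint chain laid out along the ship's length, smooth-bound so
# the nose can be pulled forward for the warp deformation.
from maya import cmds

def rig_ship_for_warp(ship_mesh, joint_count=4):
    """Place joints from stern to bow along the mesh bounding box and bind."""
    x_min, _, _, x_max, _, _ = cmds.exactWorldBoundingBox(ship_mesh)

    cmds.select(clear=True)
    joints = []
    for i in range(joint_count):
        t = float(i) / (joint_count - 1)
        x = x_min + t * (x_max - x_min)          # stern -> bow
        joints.append(cmds.joint(position=(x, 0, 0), name='warp_%02d_jnt' % i))

    # Smooth bind; stretching the bow joint now elongates the ship.
    cmds.skinCluster(joints, ship_mesh, toSelectedBones=True)
    return joints
```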

In this video collage, I've only done the rigging content.  The process for me ran from commenting on the concept art for the modelers, to pre-planning motion with the animators, to rigging, to animation processing tools for export, to game-side hookup.

 

Morgue Rig Comp

 

This video showcases some of the rigs generated from the morgue rigging system a colleague (John Stewart) and I implemented back at Perpetual Entertainment.

The general workflow and idea was to rely on Maya's referencing system and strong duplication management.  Each body part was crafted by hand with script support and kept in its own file.  Common layers of mechanics, inputs, and outputs were heavily premeditated, based on four previous auto-rigging systems he and I had created.

The reason we both moved away from a single auto-rigging solution was the time constraints we were under to implement features.  Gods and Heroes, the initial title that saw some of these rigs, had over 72 minutes of hand animation done by a team of 9+ animators over the course of 5 years of production.  We both concluded that scripting everything often took longer than getting a working component up and ready by hand.  And by settling on a standard for inputs and outputs, filtered through layers of engineering, we had a common ground for sharing and adapting animations from rig to rig quickly offline.
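
Here's a tiny illustrative sketch of those two ideas: referencing a hand-built body-part file into a character scene, and stamping a shared input/output contract onto it. The attribute names, function names, and file paths are made up for the example, not the real morgue system.

```python
# Hypothetical sketch of a morgue-style assembly: each body part lives in
# its own file and comes in via Maya referencing, and every part exposes
# the same agreed-upon input/output attributes so parts can be wired
# together the same way on every rig.
from maya import cmds

INPUTS = ('rootSpace', 'stretchFactor')
OUTPUTS = ('attachPoint',)

def reference_body_part(part_file, namespace):
    """Bring a morgue body-part file into the current scene via referencing."""
    cmds.file(part_file, reference=True, namespace=namespace)
    return namespace

def tag_interface(part_root):
    """Stamp the agreed-upon input/output attributes onto a part's root node."""
    for attr in INPUTS + OUTPUTS:
        if not cmds.attributeQuery(attr, node=part_root, exists=True):
            cmds.addAttr(part_root, longName=attr, attributeType='message')

# Example usage (paths are placeholders):
# reference_body_part('/rigs/morgue/torso_v03.ma', 'torso')
# reference_body_part('/rigs/morgue/arm_v07.ma', 'armL')
```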

This rigging meta was then applied further through Perpetual Entertainment's Star Trek Online, Slipg8 Ironworks' Project Redwood, and various in-between prototype titles at both studios.

 

Anticlimactic Blog Rebirth

Welcome!

Well, I decided to try a random blog backend for the site and it didn't go so well.  Here's the reboot's first post as I hop onto the wPress bandwagon.

My current task is to go through some 20-odd video plug-ins I've downloaded and sort out a media delivery pipe that makes me content.  I'm perhaps trying a little too hard not to post my content on YouTube, Vimeo, or other media-sharing sites, but since I've stayed primarily in the 3D world these last 9 years, my web development knowledge could use a dusting off of sorts.

The site's fluidity will be changing as I continue to learn about wPress's feature set.

-rive

Edit – shared media sites win out!