Materials

For the final foray into my Butcher's Crossing environment I had two things in mind: figure out how to use Substance B2M to apply a material to the three horses in my scene, and correct the scale of the bison/humans/horses in the scene.

I started by searching the internet for tileable furs that I could use for my horse. I found a few and ended up choosing the one featured at the top of the post. 

Integrating the JPG into B2M was really easy, intuitive, and fun to mess around with. The only real issues cropped up once I got back into Unreal, and those had more to do with my hardware than with Unreal's software.

Once the material was exported from B2M and placed into Unreal, I messed around with the Texture Sample and Texture Coordinate nodes, tried masking out different colors, and so on, but the differences were so minimal that I didn't feel they really needed to be integrated. Once the fur was actually put on the horse, I wasn't too pleased with the appearance, though I think that may have been a result of the tileable JPG I went with.

In these first five or so weeks of class I have certainly learned a lot about Unreal, but the biggest takeaway is that Unreal is not meant for Macs. Sure, a Mac can run Unreal, and you can use the foundational tools like Landscape pretty well, but when it comes to getting more advanced and integrating characters, animations, and materials, you spend more time waiting for shaders to compile or animations to load than you do actually setting up the atmosphere of your scene.

For example, in the movie featured above with the fur material placed on the horse, it doesn’t look that great because of my graphics card. The characters are more of a study in glitch than they are an exploration of realism. 

I was able to adjust the things that were a bit more under my control, though: I corrected the scale of the bison, characters, and horses. I adjusted the trees and bushes a bit to make the environment a little less sparse, too.

With all that said, I'm glad I picked up the skills I did, because I feel like the only thing holding me back with Unreal is my hardware.

The Best Laid Plans of Mice and Men…

Big plans. That's what I had for my Unreal project this week. I was going to get some new animations of a one-armed man loading a saddle onto a horse and then dramatically turning to the right or left as if hearing something in the distance. I was going to finally fix those crazy proportions in my scene (goodbye, abnormally large bison). I was going to upload a new horse I found in Unity that had actual texture. I was going to find a good sample of a coyote howling off in the distance and match it perfectly with my new animation. What did I get instead?

Jesus Christ… To quote Beckett, "Try again. Fail again. Fail better." Unfortunately, I got stuck on the second sentence of that quote. Here's how it happened.

I'll spare you the details of the lab as you were there for it. I will say you definitely had my sympathy. I can't imagine the pressure of software not working when 20+ students are coming to use it and base their homework on it. Had the software been functioning properly, I was interested in getting one animation: a one-armed model walking up to a horse, picking up a saddle (it would've been a sandbag in this case), propping it up onto a horse (a stool), and adjusting it for a few seconds before "hearing something in the distance" (turning their head sharply). I definitely plan on trying to get that in the future.

Unable to get the animation I was looking for, I decided to pivot and use a simple walking animation we cleaned up last week. Unfortunately, that walking animation didn't end in a T-pose but was instead cut off abruptly mid-stride. I figured this was all exploratory, though, so I made a new Mixamo character, put it in MotionBuilder, characterized those hips, and brought the animation into my Unreal scene. My poor Mac: the more I gave it, the more it struggled. Rather than a full character, I got a choppy one who was missing parts of his head, shoulders, and waist. It would have to do.

Next came the part I was sure would be the simplest but turned out to be the most difficult: having the walking animation trigger a sound clip. There's no reason this should've been as hard as it was. First, I tried a tutorial I found online, but after about an hour of the coyote playing every time the scene started (I checked the blueprint thoroughly), I decided I'd go with your video from the last class. I figured that even though you focused on turning on a light, triggering a sound wouldn't be too different. I played around with different nodes but had no success. I'd either hear the coyote from the beginning of the scene or never at all.

Frustrated, I called Kat over and we sat looking over Unreal for a good 20 minutes. She instructed me to insert a console log that would display a message when the character walked through the trigger. We tried that but still had zero success. At this point we both called it quits as it was around 8 and I’d been on the floor struggling with that other gaming software for hours.

Deep down, I fuckin' knew it. I knew there was something off with the character not registering the trigger. That's the only thing that could explain the console log message never appearing. At the end of the day, I'm glad the situation was resolved, even if it meant toiling over Unreal for hours only to have my problem solved literally by the click of a button.
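For anyone hitting the same wall: the Blueprint setup I was fumbling with maps to just a few lines of Unreal C++. This is a rough sketch, not my actual scene; the actor name `ACoyoteTrigger` and the variables `TriggerBox` and `CoyoteSound` are hypothetical, and I'm assuming a header that declares them (a `UBoxComponent*`, a `USoundBase*`, and the `OnOverlap` UFUNCTION). The collision gotcha that bit me is called out in the comments.

```cpp
// Sketch of a trigger volume that plays a sound on overlap (UE4).
// Requires an Unreal Engine project; won't compile standalone.
#include "CoyoteTrigger.h"
#include "Components/BoxComponent.h"
#include "Kismet/GameplayStatics.h"

ACoyoteTrigger::ACoyoteTrigger()
{
	TriggerBox = CreateDefaultSubobject<UBoxComponent>(TEXT("TriggerBox"));
	RootComponent = TriggerBox;
	// The gotcha: BOTH the trigger and the character's collision component
	// need "Generate Overlap Events" enabled, or the overlap never fires
	// and your debug message never prints.
	TriggerBox->SetGenerateOverlapEvents(true);
}

void ACoyoteTrigger::BeginPlay()
{
	Super::BeginPlay();
	TriggerBox->OnComponentBeginOverlap.AddDynamic(this, &ACoyoteTrigger::OnOverlap);
}

void ACoyoteTrigger::OnOverlap(UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
                               UPrimitiveComponent* OtherComp, int32 OtherBodyIndex,
                               bool bFromSweep, const FHitResult& SweepResult)
{
	// Fire the coyote howl once something walks through the box.
	if (OtherActor && OtherActor != this && CoyoteSound)
	{
		UGameplayStatics::PlaySoundAtLocation(this, CoyoteSound, GetActorLocation());
	}
}
```

The Blueprint version is the same idea: a Box Collision's OnComponentBeginOverlap event wired to a Play Sound at Location node, with overlap events enabled on both sides.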

I really do look forward to making this scene what’s in my head: an exploration of subtlety.

Adding Animation

Lab:

This past Friday Chris Hall, Manning Qu, Oriana Neidecker and I got together to clean up our data. Each of us took turns sitting in front of the computer and getting to know the software a bit better. We started by bringing in a simple walking animation Chris did with her group the week before. Because her hair was in a ponytail, a lot of the data around her neck was missing, which was a good thing: it forced us to figure out how to use the software properly to clean up the data.

Dirty animations
Clean animations

After the walking animation was cleaned up, the other three members headed out and I was able to take a look at a horseback riding animation that I had recorded the previous week. The good news was that this data was totally clean and required no additional work, though that may have led to complications down the line.

Workflow:

Overall, the process of using MotionBuilder was pretty easy thanks to Todd's great tutorial. The one thing that was a bit over my head was that the wrists don't seem to be animating properly. In the attached video you can see that the forearms and wrists aren't moving the way the skeleton is. Rather than folding at a 90-degree angle, the right arm juts out a bit, which makes it look like the character is petting something next to him as opposed to his own horse.


Final video:

While I'm happy that the animation plays well, there are definitely more cons than pros in this first video; mistakes that can be chalked up to my first time using the software. For example, the scale of each of the different objects – the character, horse, and bison – is very off, as is the scale of the foliage. The graphics card in my Mac is also preventing the character from appearing smoothly; instead it looks like a character from GoldenEye.

With that said, I've really enjoyed the process and look forward to learning more about the various tools we're using.

Butcher’s Crossing and Lab Documentation

John Williams' Butcher's Crossing is a novel I have a long history with.

I first read the book in 2010 and became obsessed with it. The chaotic way nature was represented was unlike any other western I had ever read or seen before. Williams' denial of Emerson's idealism and his descriptions of the protagonist Will coming face to face with futility had me quite enamored. So much so that I started a long process of trying to acquire the rights to make the book into a film. I'll spare the details, but nothing ever became of my attempts, and two years ago I had to give up the quest.

When we were given the opportunity to create an environment in Unreal, my mood board was built around a preconceived idea about baseball. That was until Butcher's Crossing came back into my brain and I couldn't resist the urge to have another go at the text.

The atmosphere I was looking to represent was twofold. First, I wanted warm colors and the look of dawn, to connote a sense of newness. Unfortunately, my limitations with the software prevented me from adorning the landscape with other features – different sorts of foliage, more muted colors, a craggier "cliff". I wanted this first part to convey a sense of stumbling onto something untouched and private.

My other atmospheric goal – and one in which I feel I didn’t succeed so well – was to convey a slight touch of danger; some sort of harbinger of ill-will. I put a light fog over the scene and sprinkled in some sharp rocks in an attempt to convey this. I also made sure some of the lake was touched by the sun to give it a sort of “on-fire” look.

At the end of the day, while I would like to make a few changes – a slightly more ominous atmosphere, a better looking horse, a few cowboys – I was really happy with what I was able to create with just a few days of UE4 under my belt. 

As a result of using source material, I have four characters that have already been created and fleshed out for me. They are:

Will: A 23-year-old Harvard dropout who has ventured out to "find himself in the great West".

Miller: A surly hunter who is eager for wealth.

Charley Hoge: A one-armed veteran and companion of Miller’s. 

Fred Schneider: The stubborn voice of dissent. 

Part Two: Lab

This past Saturday Izzy, Ran, Spencer, Manning, Teresa and I got together to gather some movement data. 

Izzy and I were a little early so we calibrated the space without any issue and began to suit up.

Manning, Ran and Teresa handled most of the software while sporadically aiding Spencer in getting all of the sensors on Izzy and me.

Once we were all suited up and skeletons had been made of us, Izzy and I began running through various movements.

We had conversations to get natural hand gestures, we climbed on ladders, we rode pretend horses (really excited to see that one), and ‘went for a swim’. We did some flying, some dancing and some crawling around in the sewers (per Teresa’s request). 

Each member of the group made known what specific gestures they needed (Manning needed some jumping, I needed the horseback riding, etc.), we saved and stored the data, and went about our ways. I think I speak for all of us when I say we're eager to see what it looks like and to integrate it into our environments!

Mood Board and MoCap

Here is a link to my mood board.

The mood board ranges from Gregory Crewdson photos to Minor League Mascots.

I’m interested in the dilapidated suburbia vibe. Not one with nefarious, Lynchian undertones but one with an awkward, painful sort of hope; one that doesn’t take itself too seriously and has a DeLilloesque sort of absurdity to it. In this vibe exists the fictional town of Equator, MO, an oppressively hot suburb where a minor league pitcher is trying to work his way up the minor league ladder. In his way is a slew of mechanical issues that only the user can fix.

As of now there are effectively two baseball video games on the market: MLB The Show and RBI Baseball. The latter features minimal pitching mechanics, while the former takes it a step further but fails to take specific atmospheric steps. For example, an entire pitcher's arsenal is open to you in MLB The Show, but the mechanics that go into that arsenal are in no way covered. This is the void I am interested in filling.

I want to work with actual minor league pitchers, capture their movements, and develop a game that focuses on the minutiae. I'm not sure if you're familiar with the mechanics of pitching, but so much goes into it: arm slot, foot placement, grip, etc. This game would not only explore those areas but also serve – I think – to explore the depths to which motion capture can go. The more in tune your mechanics, the more success you will have.

As of now, I’ve made contact with the New York Mets affiliate, the Brooklyn Cyclones, who appear to be interested in working with me in making this project. If their interest appears to be as sincere as they let on, I may pivot to helping them use motion capture to establish a training tool to make their pitching staff more effective.

BONUS: Calibratin' the space!

Just call me 'Wanda' Sykes
For your MoCap!
does it get better than exceptional?