Dana Thornton: Actually, it’s funny, it all started with the idea of putting a tactile transducer onto the bottom of the juror’s chair. I just kind of threw out that we could roll out a chair with a thing that vibrates. That’s how it all started, and then it just snowballed from there. We wanted to be able to show that, given the terrain features this vehicle went across, if you were asleep you would have woken up.
Daniel Koch: We wanted to give somebody the actual feeling of being in that cab. If you can put somebody in the cab of a semi-truck or a vehicle and give them as close a representation as possible of what really happened, it’s more powerful than words.
Daniel Koch: There are a lot of companies out there that produce hexapod platforms, but this system suited our specific needs because of its load capacity and range of movement.
Dana Thornton: After some digging we actually found a guy in New York, Sarnicola Systems, Inc. He had one sitting in his warehouse that he had… I think he had rented out and then had just sitting there ready to use. So we had this guy who had one handy and was willing to rent the platform to us.
Daniel Koch: At first it was, let’s put a seat on it. Then, let’s put a steering wheel on it. Well, let’s put pedals on it.
Dana Thornton: I pictured this thing looking like a skeleton of a truck with a bunch of things welded onto it and it wouldn’t be very realistic or convincing to someone. I said, you know, for all that, I might as well slap the cab of a truck on there.
Daniel Koch: You can answer so many more questions and convey to a jury, a client, or even a test subject that what they experienced was severe or wasn’t severe; it depends on the parameters of the case.
Dana Thornton: There was a lot that went into accurately reproducing this motion in our platform. In order to do it, first of all, we needed to make sure that we had an accurate model of the vehicle itself. Our engineers went out to the scene of the accident and we surveyed and aerially scanned the terrain. We gave this data to our visual staff, who were then able to build a 3D environment of that terrain.
Neal Carter: So it took data from the scene, because we scanned the scene, and also data from the instrumentation that was on the truck. I used that to validate a model in HVE. I matched the accelerations, I matched the vehicle motion, the speeds, and the wheel displacements. Once I had that validated model, I could simulate it through the accident scene. And then we were able to take the motion from that simulation, export it, and feed it into the motion platform.
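The validation step described here amounts to comparing simulated channels against the instrumented recordings until they agree within tolerance, then exporting the accepted motion for the platform. The sketch below illustrates that idea only in outline; the function names, the synthetic data, and the simple RMSE/peak-error criteria are all assumptions for illustration, not HVE’s actual workflow or file formats.

```python
import numpy as np

def validation_metrics(measured, simulated):
    """Compare a simulated channel (e.g. vertical acceleration) against an
    instrumented recording. Returns RMSE and peak absolute error, two simple
    agreement metrics one might check before accepting the model."""
    err = np.asarray(simulated) - np.asarray(measured)
    rmse = float(np.sqrt(np.mean(err ** 2)))
    peak = float(np.max(np.abs(err)))
    return rmse, peak

# Synthetic example: a "measured" trace and a simulation that carries a
# constant 0.02 g bias. Real channels would come from the VBox/potentiometer
# logs and the HVE run; this is purely illustrative.
t = np.linspace(0.0, 2.0, 201)                 # 2 s at ~100 Hz
measured = 0.5 * np.sin(2 * np.pi * 1.5 * t)   # fabricated 1.5 Hz heave signal
simulated = measured + 0.02                    # simulation with a small offset

rmse, peak = validation_metrics(measured, simulated)
print(f"RMSE = {rmse:.3f} g, peak error = {peak:.3f} g")
# prints: RMSE = 0.020 g, peak error = 0.020 g
```

Once a model passes checks like these on every channel, the simulated time histories can be resampled and written out in whatever format the motion platform’s controller ingests.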
Daniel Koch: I worked with Gray on instrumenting an actual semi-truck and driving it over test terrain so that we could properly calibrate the HVE model.
Dana Thornton: Gray Beauchamp and Dan Koch really were involved in that process, instrumenting the vehicle with potentiometers and a VBox, so that the dynamics of our virtual vehicle were going to match the dynamics of the real-world vehicle.
Daniel Koch: Sight’s a very powerful tool, and if you can just add that extra little bit onto it, you can show somebody that this is what it looked like to be the driver in this accident.
Dana Thornton: The VR is important because it gives a level of immersion to a participant. If someone is sitting in that cab and trying to… if we’re trying to relay what that experience was like, they need to believe that they’re in there. One of the ways to do that is to immerse them in an environment. And you can see all those aspects and you can feel all those aspects, and all together it becomes a very powerful statement.
So how do you feel? Do you have any motion sickness at all?
No. How was the overall experience? Seem pretty much integrated or…?
Dana Thornton: With the VR, you’re in that truck. You can look down and you see the steering wheel. You look to the right and you see the world going by out your window. Same out the left. The VR became important, basically, for that level of immersion, the depth of immersion. So many people were involved in this project. I got my hands involved in a lot of aspects of this case, but I still think that everybody else worked a lot harder than I did. I could not have done this without Gray and Dan taking care of the validation study. Neal Carter did a tremendous amount of work in HVE. David Hessel did a lot of work for us. He’s a genius, I’m pretty convinced. I’m sorry if I haven’t mentioned somebody specifically. For everybody that did help, thank you, because, I mean really, this project was a Kineticorp-wide project. We have a team structure, a silo structure they call it, with the different teams, and this project spanned all of those teams. This was a good example of how we, as a company, still have that ability to pull together and pool our resources for a big project like this.