[Chimera-users] 3D Viewing Collaboration
goddard at sonic.net
Thu Jun 30 17:00:26 PDT 2016
I'll send you a link to a prerelease ChimeraX. The program has far fewer capabilities than our production Chimera software, so you would get a better idea of what is possible by looking at UCSF Chimera. The ChimeraX Oculus Rift support is not in Chimera, and it currently works only on Mac OS, which Oculus has not supported for the past year. We are just now getting our Windows ChimeraX version working, and we will update to the latest Oculus API used on Windows, probably this summer.
I liked the anatomy HoloLens demo. Molecular visualization is harder to convey, since even the experts do not fully grasp the mechanisms and functions of molecules. Still, I could see an interesting demonstration on, say, the architecture of the HIV or Zika virus along the same lines as the anatomy demo. I know from making animations that producing the content is a much bigger job than implementing the rendering capabilities. For instance, the following 5-minute HIV RNA animation took me a month to create,
while adding support for viewing in Oculus Rift took about a week. So the bottleneck in making high-quality teaching visualizations is preparing the content, not supporting new technologies like HoloLens. That is important to appreciate when doing proof-of-concept technology development.
> On Jun 30, 2016, at 7:22 AM, Provance, Jeremy wrote:
> Hi Tom,
> Your Oculus work is super neat. Agreed about the developer kits; we’re using the less functional freeware until our use mandates purchase. We do a lot of R&D “proof of concept” type work alongside bread-and-butter informatics. If we can show stakeholders what 3D visualization capabilities exist, they are more likely to move in that direction.
> I do see us eventually using 3D instead of 2D in a lot of classroom work, but like you said, it will take advances in the software to make it seamless. This video <https://www.youtube.com/watch?v=eEUXuyPyD0k> was made by my boss, showing anatomy capabilities possible for a medical student. As medicine and sensing capability progress, we see this being a real patient with real-time anatomy someday. The applications are endless.
> The ChimeraX software sounds intriguing, and I would be interested in what it can do. Is there a demo version I can access? We’re honestly just seeing how far we can take visualizations at many different levels with what current technology allows.
>> On 29 June 2016, at 21:13, Tom Goddard wrote:
>> Hi Jeremy,
>> We are interested in molecular visualization with virtual reality headsets and have been trying it for a couple years with Oculus Rift developer kits 1 and 2 with our next generation ChimeraX software. Here's a description of what we tried a few years ago:
>> http://www.cgl.ucsf.edu/chimera/data/oculus-jan2014/oculus.html <http://www.cgl.ucsf.edu/chimera/data/oculus-jan2014/oculus.html>
>> We have not tried the augmented reality offered by Microsoft HoloLens, where you see your natural surroundings with elements like giant molecules added by the computer. (It is discouraging that their developer kit is so expensive: $3000, where the Oculus DK1 was $300.) I don't have a clear idea of why viewing molecules in our natural surroundings makes sense. I guess in a classroom where all the students have headsets, it allows the students to see each other, so it would be better for discussion. A basic element would be that all the headsets view the same scene, although each is probably connected to a separate computer, so a challenge in implementing this is syncing the scene across multiple computers. A start-up company in San Francisco called High Fidelity is making an open-source multiplayer virtual reality platform to do this kind of thing. I saw a demo a month ago, though with immersive headsets where you don't see your surroundings; each participant has a human-like avatar that others can see in the scene.
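The scene-sync challenge mentioned above can be pictured as a tiny protocol: one computer (the leader) broadcasts its camera pose and model transform, and each follower applies it so every headset shows the same view. A minimal sketch follows; the message fields, port, and multicast address are illustrative assumptions, not part of any Chimera/ChimeraX or High Fidelity API.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class ScenePose:
    """Hypothetical sync message shared among viewing computers."""
    position: tuple      # camera position (x, y, z) in scene units
    rotation: tuple      # camera orientation quaternion (w, x, y, z)
    model_scale: float   # uniform scale applied to the molecule

    def encode(self) -> bytes:
        """Serialize the pose for broadcast to the other viewers."""
        return json.dumps(asdict(self)).encode("utf-8")

    @staticmethod
    def decode(data: bytes) -> "ScenePose":
        """Rebuild a pose from a received datagram."""
        d = json.loads(data.decode("utf-8"))
        return ScenePose(tuple(d["position"]), tuple(d["rotation"]),
                         d["model_scale"])

# Leader side (sketch):   sock.sendto(pose.encode(), ("239.0.0.1", 5005))
# Follower side (sketch): pose = ScenePose.decode(sock.recv(1024))
```

In practice each follower would apply the decoded pose to its own renderer every frame; the hard parts (clock skew, dropped packets, who leads) are exactly the kind of thing a platform like High Fidelity tries to solve.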
>> Our ChimeraX software is not yet released, but it can already do the needed stereoscopic rendering. Or perhaps what you need instead is molecular models exported in a 3D format, to use in custom software you write to coordinate the views of multiple participants. I'm happy to help. Tell me what you envision our role would be in your project.
>>> On Jun 29, 2016, at 8:05 AM, Provance, Jeremy wrote:
>>> My name is Jeremy and I work at the University of Missouri—Kansas City School of Medicine. To cut to the chase, we’re doing R&D work with Microsoft HoloLens as it applies to medicine in both macroscopic (telemedicine) and molecular (pharmaceutical compound visualization and manipulation) applications.
>>> I’ve been using Chimera for some time in my academic explorations and wanted to contact you all to start a conversation about how we might collaborate to make molecular manipulation in 3D a reality. As the proof of concept, I’m imagining importing a protein and coloring areas of polarity and hydrophobicity in real time as the protein floats in the space of a classroom. Let me know if this is something you all would be interested in working on together.
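The proposed proof of concept, coloring a protein by hydrophobicity in real time, boils down to mapping a per-residue attribute onto a color gradient. A small sketch of that mapping: the Kyte-Doolittle scale values are the standard published ones, while the blue-white-orange gradient endpoints are an arbitrary illustrative choice (Chimera itself exposes a similar per-residue attribute for its own coloring commands).

```python
# Kyte-Doolittle hydrophobicity scale (standard values;
# positive = hydrophobic, negative = hydrophilic).
KD = {
    "ILE": 4.5, "VAL": 4.2, "LEU": 3.8, "PHE": 2.8, "CYS": 2.5,
    "MET": 1.9, "ALA": 1.8, "GLY": -0.4, "THR": -0.7, "SER": -0.8,
    "TRP": -0.9, "TYR": -1.3, "PRO": -1.6, "HIS": -3.2, "GLU": -3.5,
    "GLN": -3.5, "ASP": -3.5, "ASN": -3.5, "LYS": -3.9, "ARG": -4.5,
}

def residue_color(resname: str) -> tuple:
    """Map a residue's hydrophobicity to an RGB triple:
    blue (hydrophilic, -4.5) -> white (0) -> orange (hydrophobic, +4.5)."""
    t = KD[resname.upper()] / 4.5      # normalize to [-1, 1]
    if t < 0:
        f = -t                          # interpolate white -> blue
        return (1.0 - f, 1.0 - f, 1.0)
    # interpolate white -> orange (1.0, 0.6, 0.0)
    return (1.0, 1.0 - 0.4 * t, 1.0 - t)
```

Recomputing this per residue each frame is cheap, so the "real-time" part of the demo is mostly a rendering problem, not a chemistry one.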
>>> Jeremy Provance <https://twitter.com/jeremyprovance> | Software Analyst, Center for Health Insights <http://chi.umkc.edu/>
>>> 2411 Holmes Street, Kansas City, MO 64108 | MG-204A
>>> (816) 235-1938 | provancej at umkc.edu <mailto:provancej at umkc.edu>
>>> Chimera-users mailing list: Chimera-users at cgl.ucsf.edu <mailto:Chimera-users at cgl.ucsf.edu>
>>> Manage subscription: http://plato.cgl.ucsf.edu/mailman/listinfo/chimera-users <http://plato.cgl.ucsf.edu/mailman/listinfo/chimera-users>