
Implement interaction visualization #692

Open
MartinBaGar opened this issue Dec 16, 2024 · 2 comments

@MartinBaGar

I created a Blender extension to visualize interactions occurring in molecular dynamics simulations. After a quick chat with @BradyAJohnston and one Blender reviewer, we thought of implementing this extension directly into Molecular Nodes.
Let's discuss it here!

The code is currently hosted on my repo Viber.

It is quite simple, and it doesn't use any Python libraries beyond those bundled with Blender.
Two steps are necessary to visualize the interactions: the first is external to both Blender and the extension, and the second is handled by the extension.

  • The first step consists in identifying the interactions occurring in the simulation and storing this information in a JSON file. A mini-tutorial in a Jupyter notebook on my repo should help you generate this JSON file.
  • The second step simply reads the JSON file, identifies interactions (e.g., between Atom 1 and Atom 2), and uses Blender's API to create an "interaction" object with the same coordinates as vertices 1 and 2 (for Atoms 1 and 2) in the Blender model generated by Molecular Nodes. For each frame, a Blender frame handler updates the "interaction" object's coordinates by following the corresponding vertices in the MN object.
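To make the two steps concrete, here is a minimal sketch of what reading such a JSON file could look like. The layout shown is purely illustrative (the actual schema lives in the Viber repo): it assumes each frame maps to a list of entries pairing two atom indices with an interaction type.

```python
import json

# Hypothetical JSON layout -- illustrative only; the real schema is
# defined in the Viber repo. Each frame lists interactions as an
# interaction type plus the two atom indices involved.
example = """
{
  "frames": {
    "0": [
      {"type": "hydrogen_bond", "atoms": [12, 847]},
      {"type": "salt_bridge",   "atoms": [33, 902]}
    ],
    "1": [
      {"type": "hydrogen_bond", "atoms": [12, 847]}
    ]
  }
}
"""

def load_interactions(text):
    """Parse the JSON and return {frame_index: [(type, atom_a, atom_b), ...]}."""
    data = json.loads(text)
    out = {}
    for frame, entries in data["frames"].items():
        out[int(frame)] = [(e["type"], *e["atoms"]) for e in entries]
    return out

interactions = load_interactions(example)
print(interactions[0])  # [('hydrogen_bond', 12, 847), ('salt_bridge', 33, 902)]
```

Inside Blender, the second step would then look up vertices 12 and 847 in the Molecular Nodes object and place the "interaction" object's endpoints at their coordinates.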

I think the first step should stay external, as everyone may have their own method for detecting interactions. A universal JSON format would allow each method to be visually represented in Blender.
The second step is probably the most important one for the implementation.

On the cosmetic side, it currently uses different materials for the most frequent interaction types (hydrogen bonding, salt bridge, pi-stacking, pi-cation), but it can be extended to support many more interactions.
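The per-interaction materials could be driven by a simple lookup table. The colours and type names below are illustrative, not the ones used by the Viber extension:

```python
# Hypothetical colour table for per-interaction materials; the RGBA
# values and type names are illustrative only.
INTERACTION_COLORS = {
    "hydrogen_bond": (0.2, 0.6, 1.0, 1.0),  # blue
    "salt_bridge":   (1.0, 0.3, 0.3, 1.0),  # red
    "pi_stacking":   (0.6, 0.3, 1.0, 1.0),  # purple
    "pi_cation":     (1.0, 0.8, 0.2, 1.0),  # yellow
}

def color_for(interaction_type):
    """Fall back to grey for interaction types without a dedicated material."""
    return INTERACTION_COLORS.get(interaction_type, (0.5, 0.5, 0.5, 1.0))
```

In Blender these RGBA values could feed newly created materials (e.g. via `bpy.data.materials.new(...)`), so adding support for a new interaction type is just a new dictionary entry.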

Is there something you want to prioritize for this implementation? If something is not clear, please tell me!

@BradyAJohnston
Owner

Thanks for wanting to contribute! I think this would fit well with the overall add-on and help enable more potential visualisations of MD data & analysis.

Regarding a couple of your points:

  • Identifying interactions: in theory we could run an arbitrary calculation whenever the frame changes, detecting interactions live during playback without requiring the import of a .json. But I don't know what the performance would be like, and as you say, people have their own methods for determining interactions, so it's simplest to just enable import (running detection live would also add more Python package dependencies). If there were a method that was clearly 'the best & most used', we could later implement it in MN to run 'live', but that can come later and re-use the existing code.

  • I think your approach is the right one for visualising the bonds. Having the object be separate from the trajectory being visualised will be good. There's some potential for going out of sync if users transform the trajectory object, but that isn't common practice for MD visualisation anyway.

I'm going to do some more technical write-ups on how everything works over the next couple of days (after I've finished some refactoring), but briefly, this is how I think it would fit into everything:

There is an MNSession object that is part of the scene, which keeps track of all of the entities that have been imported. It is part of the individual .blend file and is saved & reloaded. All other Python objects are lost between saves, so anything we want to persist has to be kept in this session:

bpy.context.scene.MNSession.entities -> Dict[str, Union[Molecule, Trajectory, Ensemble, Density]]

This dictionary is where the objects are stored so that they persist and can be used later in the session or across save & reload cycles.

There is a single post_frame_change handler function which iterates over the entities and calls their .frame_set(context.scene.frame_current). To have the visualisation object (unsure about the name) be part of the scene, we should just have to subclass the MolecularEntity class. Upon creation it is automatically registered with the session for saving & per-frame updates.

The visualisation entity can then just have a property linking to the Trajectory, to source its up-to-date positions when updating.
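A bpy-free sketch of the registration and per-frame dispatch pattern described above might look like this. The names MNSession and MolecularEntity come from Molecular Nodes; the InteractionEntity subclass and all internals here are hypothetical simplifications:

```python
# Sketch only: a minimal, bpy-free model of the session/entity pattern.
# MNSession / MolecularEntity exist in Molecular Nodes; InteractionEntity
# and the data layout below are hypothetical.
class MNSession:
    def __init__(self):
        self.entities = {}  # Dict[str, MolecularEntity]

    def frame_changed(self, frame):
        # Mirrors the single post_frame_change handler: every registered
        # entity is updated to the scene's current frame.
        for entity in self.entities.values():
            entity.frame_set(frame)

class MolecularEntity:
    def __init__(self, session, name):
        self.name = name
        session.entities[name] = self  # auto-register on creation

    def frame_set(self, frame):
        raise NotImplementedError

class Trajectory(MolecularEntity):
    def __init__(self, session, name, frames):
        super().__init__(session, name)
        self.frames = frames          # frame -> {atom_index: (x, y, z)}
        self.positions = frames[0]

    def frame_set(self, frame):
        self.positions = self.frames[frame]

class InteractionEntity(MolecularEntity):
    """Links to a Trajectory and re-reads endpoint positions each frame."""
    def __init__(self, session, name, trajectory, atom_pair):
        super().__init__(session, name)
        self.trajectory = trajectory
        self.atom_pair = atom_pair
        self.endpoints = None

    def frame_set(self, frame):
        a, b = self.atom_pair
        self.endpoints = (self.trajectory.positions[a],
                          self.trajectory.positions[b])

session = MNSession()
traj = Trajectory(session, "traj", {0: {1: (0, 0, 0), 2: (1, 0, 0)},
                                    1: {1: (0, 0, 1), 2: (1, 0, 1)}})
hb = InteractionEntity(session, "hb", traj, (1, 2))
session.frame_changed(1)
print(hb.endpoints)  # ((0, 0, 1), (1, 0, 1))
```

One subtlety: since Python dicts preserve insertion order, creating the trajectory before the interaction entity means its positions are updated before the interaction reads them on each frame change.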

@BradyAJohnston
Owner

Tagging @yuxuanzhuang, who I am sure is interested. They have already started creating https://github.com/yuxuanzhuang/ggmolvis, which is in the same vein, and we are going to be working on a bunch of analysis-visualisation features in the future.
