
Musical context for mouse events #1050

Open
gdicristofaro opened this issue Jul 4, 2021 · 3 comments
@gdicristofaro

Hello,

For a project I was working on using VexFlow, I was looking to get some musical context for mouse events (e.g., which pitch on the stave, closest measure, or nearest note the event corresponds to). I put together a simple demo here and the source code is here. I wasn't sure if you would have any interest in integrating this or something similar into VexFlow, but I could work up a pull request if you would like. Thanks for all your work on this. I really enjoy this project.
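This isn't the demo's actual code, but the y-to-pitch half of that context lookup can be sketched roughly like this. It's a hypothetical helper assuming a treble clef and VexFlow's default 10px spacing between stave lines (so each line/space step is 5px); `pitchFromY` and `TREBLE_STEPS` are made-up names:

```javascript
// Pitches on a treble stave from the top line (F5) down to the bottom line (E4).
const TREBLE_STEPS = ['f/5', 'e/5', 'd/5', 'c/5', 'b/4', 'a/4', 'g/4', 'f/4', 'e/4'];

// Map a mouse y-coordinate to the nearest on-stave pitch, given the
// y-coordinate of the stave's top line. Each half-line step is stepPx tall.
function pitchFromY(mouseY, staveTopLineY, stepPx = 5) {
  const steps = Math.round((mouseY - staveTopLineY) / stepPx);
  // Clamp to the stave; a fuller version would handle ledger lines too.
  const clamped = Math.max(0, Math.min(TREBLE_STEPS.length - 1, steps));
  return TREBLE_STEPS[clamped];
}
```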

Greg

@ronyeh
Collaborator

ronyeh commented Jul 4, 2021

Pretty cool demo! (I'm not the maintainer, but) I think it'd be interesting to someday support some level of interactivity via the vexflow library. Mouse clicks and finger taps to select notes, or even animations or a built-in "cursor" where you can highlight the current note(s) and advance forward and backward.

@AaronDavidNewman
Collaborator

Smoosic is a library that does this in a similar way. It also tracks with the cursor keys if you are on a keyboard device. See tracker.js and mapper.js.

One thing working against pushing this kind of thing into VF is that you need to create some type of musical object model to express what is rendered, and the VF object model is very rendering-specific, and very application-agnostic.

It might make sense to have a project that just does the mapping and tracking though, so every project doesn't have to reinvent the wheel. You'd have to create some type of data contract about how to get the music information in a model-agnostic way. The logic of calculating and managing the bounding boxes is probably pretty similar, regardless of the musical model or the underlying device.
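The bounding-box side of such a data contract could be as small as this sketch: the host application supplies plain records with an id and a box, and the shared tracker only does geometry. The names and record shape here are illustrative, not Smoosic's or VexFlow's API:

```javascript
// entries: [{ id, box: { x, y, w, h } }, ...] supplied by the host app
// in whatever musical model it uses; the tracker never inspects the model.
function nearestBox(entries, x, y) {
  let best = null;
  let bestDist = Infinity;
  for (const { id, box } of entries) {
    // Squared distance from the point to the box (0 if the point is inside).
    const dx = Math.max(box.x - x, 0, x - (box.x + box.w));
    const dy = Math.max(box.y - y, 0, y - (box.y + box.h));
    const dist = dx * dx + dy * dy;
    if (dist < bestDist) {
      bestDist = dist;
      best = id;
    }
  }
  return best;
}
```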

Here is a fiddle of the Smoosic tracker:

https://codepen.io/aarondavidnewman/pen/WNoRqgg

And the full project:

https://github.com/AaronDavidNewman/Smoosic

@gdicristofaro
Author

Hello to both of you, and thanks for your thoughts! I was attempting to emulate some of the behavior I had seen in notation software, where a note you add at a certain point is based on the context: for instance, the y position relative to the stave determines the pitch, and the x position relative to the other notes determines the beat. I was aiming to provide just the context part with this code and expand on it later for my own project. Concerning the data model, there may be a more VexFlow-specific way of expressing some of these things (e.g., pitches, beats) that could be adopted if that would be best.
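The "x position determines the beat" part could be sketched like this (a hypothetical `insertionIndex` helper, not taken from the demo): given the x-coordinates of the notes already laid out in a measure, find where a click's x would insert a new note.

```javascript
// noteXs: ascending x-coordinates of the rendered notes in a measure.
// Returns the index at which a note clicked at clickX would be inserted.
function insertionIndex(noteXs, clickX) {
  let i = 0;
  while (i < noteXs.length && noteXs[i] < clickX) i++;
  return i;
}
```

A real version would then translate that index into a beat using the durations of the surrounding notes, which is where the musical model comes back in.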

ronyeh added the future label on Sep 30, 2021