Sorry for bothering you again #12
It seems that the auto-calibration gives you a wrong threshold value for the binarization. That's strange. We need to dig in. Go back to the test and try:

    try:
        cv2.imshow('Binarized left eye', gaze.eye_left.pupil.iris_frame)
        cv2.imshow('Binarized right eye', gaze.eye_right.pupil.iris_frame)
    except Exception as e:
        print(e)
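[Editor's note] For context, a minimal sketch of how a debugging snippet like the one above might sit inside a webcam loop. It assumes the GazeTracking usage pattern referenced elsewhere in this thread (a GazeTracking() object refreshed with each webcam frame); treat it as illustrative, not the exact demo script.

    import cv2
    from gaze_tracking import GazeTracking

    gaze = GazeTracking()
    webcam = cv2.VideoCapture(0)

    while True:
        ret, frame = webcam.read()
        if not ret:
            break
        gaze.refresh(frame)              # analyze the new frame
        cv2.imshow('Webcam', frame)

        # The binarized pupil frames only exist once the eyes were detected,
        # hence the try/except guard from the snippet above.
        try:
            cv2.imshow('Binarized left eye', gaze.eye_left.pupil.iris_frame)
            cv2.imshow('Binarized right eye', gaze.eye_right.pupil.iris_frame)
        except Exception as e:
            print(e)

        if cv2.waitKey(1) == 27:         # press Esc to quit
            break

    webcam.release()
    cv2.destroyAllWindows()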
I guess I edited my previous message after you ran the test. Add the code I gave you, and you should see two new frames on your screen. Take a screenshot and send it to me.
I can't see your picture if you add new replies to this thread by email. Please go to GitHub. :)
Yes, the threshold value given by the auto-calibration system is definitely wrong for you. Before the loop, add:

    results_displayed = False

And then, in the loop, add this:

    if gaze.calibration.is_complete() and not results_displayed:
        results_displayed = True
        print(gaze.calibration.thresholds_left)
        print(gaze.calibration.thresholds_right)
        try:
            cv2.imshow('Right eye', gaze.eye_right.frame)
            cv2.imshow('Left eye', gaze.eye_left.frame)
            cv2.imshow('Binarized left eye', gaze.eye_left.pupil.iris_frame)
            cv2.imshow('Binarized right eye', gaze.eye_right.pupil.iris_frame)
        except Exception as e:
            print(e)

Keep looking at the center for 5 seconds when you start the program.
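[Editor's note] Since the thread is about spotting a bad auto-calibrated threshold, here is a small, hypothetical helper for interpreting the two printed lists. It assumes only what the snippet above shows (that thresholds_left and thresholds_right are iterables of numeric threshold candidates); anything beyond that is an assumption, not the library's documented API.

    def check_thresholds(gaze):
        """Print the calibration samples and flag obviously degenerate averages.
        Hypothetical helper; assumes thresholds_left/right hold numeric samples."""
        sides = (('left', gaze.calibration.thresholds_left),
                 ('right', gaze.calibration.thresholds_right))
        for side, values in sides:
            values = list(values)
            if not values:
                print(f"{side} eye: no calibration samples yet")
                continue
            average = sum(values) / len(values)
            print(f"{side} eye: samples={values} average={average:.0f}")
            # A threshold near 0 or 255 binarizes the eye crop to almost all
            # black or all white pixels, which matches the symptom in this thread.
            if average <= 5 or average >= 250:
                print(f"  -> {side} threshold looks degenerate")

It would be called once, right after gaze.calibration.is_complete() becomes true (for example, next to the two print() calls above).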
Thanks, but I also need a screenshot containing the 4 frames you get.
Hey :) Did you find anything?
Is that normal?
Now it works! I don't understand!
OK, it seems there is a problem with the threshold. Every time I change the lighting setup behind my camera, it either works or it doesn't. Does the code also work in the dark?
Hi @Blackhorse1988
Hey :) It's your impressive code :) I changed the threshold manually to the best value, so an auto-calibration would be great. I will do that tomorrow, and really, thank you for your time and support! I am working with that robotic arm in my job and I would like to do the same with the glasses from Pupil Labs too. Do you have experience with making the code work on Android?
You did well to pass your own threshold. What value did you choose? With everyone I tried it on, the auto-calibration system worked pretty well. Too bad it's not the same for you. According to your screenshots, the program had calculated consistent values for which it obtained good iris sizes, so you shouldn't get binarized frames with only white pixels. In the worst case, you should get a wrong iris binarization, but still with some black pixels. I'm a little bit surprised and I would like to understand why, so I will keep you posted after I've done some tests with your samples. OpenCV works on Android, so it seems to me that it would be possible to run a version of this code on a mobile device, with Java/Kotlin, or maybe with tools that make it possible to run Python code. About Pupil Labs, I don't know it well, I've never tried it.
Hi Antoine,
I chose around 100, and in the evening 70, and it worked well. Thank you very much. I would like to progress with your great code, and if you want, you can help us as a supplier of robotic arms for disabled people; that is the background of all this. I have created an application and I would like to run it on Android. Do you have experience with that kind of programming? As I said, I am a rookie in Python, OpenCV etc. and more of an all-rounder. I would like to combine an eye control like this with my application, because then I don't need an extra interface and I can use my own Bluetooth module to transmit the data to the microcontroller. I would also pay something if you could help; we can talk about it.
Best regards,
Peter
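[Editor's note] As an aside on the fixed values mentioned above (around 100, or 70 in low light): binarizing an eye crop with a fixed threshold in plain OpenCV looks roughly like the sketch below. This is an illustration using standard OpenCV calls, not the library's internal implementation, and the file and variable names are made up.

    import cv2

    # eye_frame: a grayscale crop of one eye (made-up file name)
    eye_frame = cv2.imread('eye_crop.png', cv2.IMREAD_GRAYSCALE)
    if eye_frame is None:
        raise SystemExit('Could not load the sample eye crop')

    threshold_value = 100          # e.g. 100 in daylight, 70 in the evening
    _, binarized = cv2.threshold(eye_frame, threshold_value, 255,
                                 cv2.THRESH_BINARY)

    # Pixels darker than the threshold (pupil/iris) end up black, the rest
    # white. A threshold that is far too high or too low produces the
    # almost-all-white / all-black frames discussed above.
    cv2.imshow('Binarized eye', binarized)
    cv2.waitKey(0)
    cv2.destroyAllWindows()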
If I mounted a camera on glasses and just detected one eye, would that also work with your code?
Hi Antoine, do you have an idea how to modify the code to track just one eye?
Hi Antoine,
Hi @Blackhorse1988
In [...]:

    int(self.eye_right.pupil.x)
    int(self.eye_right.pupil.y)

In [...]:

    self.eye_right = Eye(frame, landmarks, 1, self.calibration)

Then update [...]. Happy coding!
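[Editor's note] Parts of the answer above were lost in the thread export, so here is a hedged sketch of the general idea for consuming only one eye: read just the left-eye data and guard against frames where it was not detected. It relies only on the attributes already shown in this thread (gaze.eye_left, .pupil.x, .pupil.y) and does not reproduce the exact edits Antoine had in mind.

    # Sketch: use only the left eye, assuming gaze.refresh(frame) was called
    # in the webcam loop and the eye/pupil attributes may be None on frames
    # where detection failed.
    def left_pupil_coords(gaze):
        eye = gaze.eye_left
        if eye is None or eye.pupil is None or eye.pupil.x is None:
            return None                        # no detection on this frame
        # Pupil position relative to the eye crop, as in the snippet above
        return int(eye.pupil.x), int(eye.pupil.y)

    coords = left_pupil_coords(gaze)
    if coords is not None:
        print("Left pupil at", coords)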
Hi @nishthachhabra
Hi @antoinelame, I'm having a problem similar to @Blackhorse1988: most of the time the application only identifies blinking and looking to the center. A few times it identifies looking to the right. I could not at any point get it to identify looking up or down.
Hi @EderSant,
Yes, you're right, the wink detection is perfectible and I will improve it. However, the "blinking state" doesn't mean that the program doesn't know whether you are looking to the right or to the left. If you are using the demo, the display of "Blinking" overrides the display of the gaze direction. In that case, you can just remove these lines in the demo script:

    if gaze.is_blinking():
        text = "Blinking"

About the identification of up/down, I didn't write it. Please, next time, open your own issue. 😉
Well, the eye tracking works fine when I set [...].
With -2, no pupils are detected.
But the tracking doesn't follow the pupils when I move just my eyes.
When I move my head, it works.
Do you have an idea?