Monday, August 8, 2016
Thursday, August 4, 2016
Today I continued work on the 3D environment, including acquiring the high-quality picture scans and preparing them to be placed in the environment as pictures. I also took part in a small image-quality experiment, in which I looked at pairs of pictures and judged which one had the higher quality.
Tuesday, August 2, 2016
Today some of the interns brought their friends, so the reading room was packed all day. We also had a peer review of our outlines, and we went to Salsarita's for lunch. As for me, I continued work on the virtual space, creating a small mock-up of what a gallery might look like in virtual reality.
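A mock-up like this mostly comes down to spacing pictures evenly along the walls. This isn't the lab's actual code, just a sketch of the spacing math, and every dimension here (wall length, picture width, eye height) is an invented placeholder:

```python
# Sketch: evenly spacing picture frames along one wall of a virtual gallery.
# All measurements below are made-up example values, not the ones from the lab.

def frame_positions(wall_length, picture_width, count, eye_height=1.5):
    """Return (x, y) centers for `count` pictures hung along a wall,
    with equal gaps between pictures and at both ends."""
    gap = (wall_length - count * picture_width) / (count + 1)
    positions = []
    x = gap + picture_width / 2  # center of the first picture
    for _ in range(count):
        positions.append((round(x, 2), eye_height))
        x += picture_width + gap
    return positions

# Four 1 m pictures on a 10 m wall:
print(frame_positions(10.0, 1.0, 4))
# → [(1.7, 1.5), (3.9, 1.5), (6.1, 1.5), (8.3, 1.5)]
```

Each (x, y) pair would then become the position of a textured quad in the 3D scene.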
Friday, July 29, 2016
The first half of our day was taken up by a field trip to the Eastman House. We went on a tour of one of the exhibits and also got a private tour of the conservator's lab. Needless to say, the trip was awesome and informative, even though it didn't have much to do with what I am currently working on. We then went to lunch at Amiel's, which was delicious. When I got back, I continued work on Vizard and the virtual reality space.
Wednesday, July 27, 2016
Today we started fully working on Python and Vizard. I created a simple scene with some pre-made 3D models and animated some plants to move around and spin. I also learned about back-face culling: faces of geometry whose normals point away from the camera can be skipped during rendering. Culling these faces improves performance whenever the player never needs to see the back side of an object on screen.
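The test behind back-face culling is just a dot product: compute the face normal from the triangle's winding order, and skip the face if the normal points along the view direction. A minimal sketch in plain Python (the engine does this on the GPU, of course; the camera here is assumed to look down the −z axis):

```python
# Sketch of the back-face culling test: a triangle whose normal points
# away from the camera can be skipped during rendering.

def cross(a, b):
    """Cross product of two 3D vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def is_back_face(v0, v1, v2, view_dir=(0, 0, -1)):
    """True if the triangle (counter-clockwise winding) faces away
    from a viewer looking along `view_dir`."""
    e1 = tuple(b - a for a, b in zip(v0, v1))
    e2 = tuple(b - a for a, b in zip(v0, v2))
    n = cross(e1, e2)  # face normal from the winding order
    # Facing away when the normal points along the view direction.
    return sum(ni * vi for ni, vi in zip(n, view_dir)) > 0

# Counter-clockwise triangle with its normal toward the camera (+z):
print(is_back_face((0, 0, 0), (1, 0, 0), (0, 1, 0)))  # → False (front face, kept)
```

Reversing the winding order (swapping two vertices) flips the normal and makes the same triangle a back face.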
Friday, July 22, 2016
For the first half of the day we looked at and organized more data, and got a little farther with the mobile data. For the second half of the day, I was introduced to Vizard, the software the Perform lab uses to create virtual environments for use with a VR headset.
Wednesday, July 20, 2016
Tuesday, July 19, 2016
Today we went through another round of trials in the morning, which went very smoothly. We also started to put in the framework to analyze our data fully, including making more spreadsheets and starting to draw AOIs, or Areas of Interest, on our video recordings.
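Once AOIs are drawn, the analysis boils down to checking which gaze samples land inside each region and summing up the time. A toy sketch of that idea (the AOIs, coordinates, and sampling interval below are invented examples, not our actual recordings or SMI's software):

```python
# Sketch of AOI (Area of Interest) analysis: given gaze samples as
# (timestamp_ms, x, y) points, tally the time spent inside each
# rectangular AOI. All data here is invented for illustration.

def dwell_times(samples, aois, sample_interval_ms=4):
    """Sum dwell time per AOI, assuming gaze samples arrive at a
    fixed interval (e.g. 4 ms for a 250 Hz tracker)."""
    totals = {name: 0 for name in aois}
    for _, x, y in samples:
        for name, (left, top, right, bottom) in aois.items():
            if left <= x <= right and top <= y <= bottom:
                totals[name] += sample_interval_ms
    return totals

aois = {"painting": (100, 100, 300, 250), "label": (100, 260, 200, 300)}
samples = [(0, 150, 120), (4, 160, 130), (8, 120, 280), (12, 500, 500)]
print(dwell_times(samples, aois))  # → {'painting': 8, 'label': 4}
```

The last sample falls in no AOI and is simply not counted, which is also how off-stimulus gaze is usually handled.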
Friday, July 15, 2016
Eye tracking, and studies involving eye tracking, have a long and complicated history. Traditionally, observers taking part in a study using a remote eye tracker were restricted in their mobility, often with the head and body confined while viewing stimuli on a computer screen, to get the best possible readings. Recently, however, the mobile eye tracker has become increasingly popular in eye tracking studies. The observer is now able to move around, and the stimuli chosen have become far more akin to real-life situations. Unfortunately, data collected with a mobile eye tracker is less accurate than data from a remote eye tracker, and the convenience of a fully controlled experimental environment is traded for realism. Furthermore, studies using mobile eye trackers are much more costly and time-consuming than those using a remote eye tracker. The purpose of our study is to determine whether there is a measurable difference in eye movements and reactions to stimuli between a real-life scenario and a virtual scenario. Moreover, if a difference exists, we will assess the viability of virtual reality technology as a middle ground between the convenience of a virtual setting and the actuality of a real setting. To perform this experiment, we will use eye tracking devices and software from SMI to collect and analyze our data. We hope that the information we gather will be useful for future experimental design using eye trackers.
Today we finalized our experiment design and worked on our abstract in the morning. Then we went out for a nice lunch at Global Village for an hour. When we came back, we did a dry run of part of our experiment, which I can't spoil because some of the other interns will be our observers.
Thursday, July 14, 2016
Today we made a lot more progress on our experiment. We finalized our stimuli and actually finished our experiment design in SMI Experiment Center. By the end of the day, we had tested the designed experiment with the remote eye tracker and figured out how to import the data we gathered into our analysis program.
Wednesday, July 13, 2016
Today we started to get into designing our experiment. The basic principle is to compare eye movements between a real scenario and a virtual scenario. We first discussed our questionnaire and what we would ask our participants. Then we learned about calibrating the mobile eye tracker and tested it out in one of the campus's many galleries. Finally, we started to design the virtual part of the experiment, and we are making good progress on it.
Tuesday, July 12, 2016
Today we finally met the two professors we will be working with for the next three weeks. They introduced us to their idea for an experiment and asked us to start thinking about the experimental design and what we want to do with it. We also took a look at the software we will be using to design experiments on the computer, as well as both the remote and mobile eye trackers. Afterwards, we all went to a talk about salience and its relation to color, which I found very enjoyable and informative.
Friday, July 8, 2016
Heading into the weekend, I had a more relaxed day today. Our group first discussed papers we had picked out related to vision and learned how to use Google Scholar to find academic papers. Then we had an extended lunch, where all of the interns struggled to grill hamburgers and set up a volleyball net. After that, we came back and looked at more papers about both human vision and machine vision. Overall, it was a more relaxed day to take me into the weekend, which I appreciated.
Thursday, July 7, 2016
For Day 2, we started out by organizing food for tomorrow's picnic. I don't have to bring anything in yet, but I was warned that I would next time. After that, my group went to the virtual reality lab and was briefed on how virtual reality is being used to support the human vision work in the rest of the lab. Then we went to boot camp again and learned about a new eye tracker that Jeff is developing. After that was lunch, at which I had a turkey sandwich; not quite as good as yesterday's, for sure. For the rest of the day, we each had video taken of our eyes, and Jeff explained how and why he used the apparatus he did to capture it. Overall, a great day.
Wednesday, July 6, 2016
The internship has started! I am in the Visual Perception lab with three other interns and at least one student. To start off the day, we took a brief tour of all of the labs in the Center, including labs that are not part of the internship but are fascinating nonetheless. After the tour, all the interns were taken to the Red Barn for some team-building activities. Despite the scorching heat, I had fun meeting my fellow interns, playing games, and working through problems with them. After that was lunch, and then we finally split into our groups. I share the lab with Alice and Maria, both very nice girls. From 1:00 to about 4:30, we sat in Visual Perception "boot camp," learning about the structure of the eye, how it relates to the brain, and how it is controlled by the brain and the eye muscles. We also briefly explored the relationship between eye movements and the task given to an observer, and learned about eye tracking technologies. Overall, it was a great first day and a great first experience for me.