Monday, August 8, 2016

Day 22

This whole week will be short for me due to band camp, but before leaving I began work on the Virtual Reality and Python portions of our presentation.

Day 21

Today was a short day for me, as I had to leave to participate in a parade. Otherwise, we figured out a title for our presentation and turned that in along with a heavily modified abstract.

Day 20

Today I finished the 3D gallery and presented it to Gabe, who thought it was great. We also polished up our abstract to turn in tomorrow.

Thursday, August 4, 2016

Day 19

Today I continued work on the 3D environment, including taking the high-quality picture scans and getting them ready to be placed in the environment as pictures. I also took part in a little quality experiment, which had me look at two pictures and determine which was of higher quality.
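
In Vizard terms, hanging one of those scans in the environment boils down to loading the image as a texture and applying it to a quad. A rough sketch is below; the file name, dimensions, and positions are placeholders rather than the real assets.

    import viz

    viz.go()

    # Load one of the high-resolution scans as a texture
    # (hypothetical file name).
    scan = viz.addTexture('painting_scan_01.jpg')

    # A textured quad stands in for the framed picture on the gallery wall.
    picture = viz.addTexQuad()
    picture.texture(scan)

    # Scale the quad to the painting's rough real-world size (in meters)
    # and hang it at eye height on a wall 3 meters ahead of the viewer.
    picture.setScale([1.2, 0.9, 1])
    picture.setPosition([0, 1.6, 3])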

Tuesday, August 2, 2016

Day 18

Today, some of the interns brought their friends, so the reading room was packed all day. We also had a peer review of our outlines and went to Salsarita's for lunch. As for me, I continued work on the virtual space, creating a small mock-up of what a gallery might look like in virtual reality.
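
The mock-up itself was not much more than a gallery model with the viewpoint placed inside it. A bare-bones version in Vizard might look like the sketch below; 'gallery.osgb' is assumed to be one of Vizard's bundled example models, and any gallery-like model would slot in the same way.

    import viz

    viz.setMultiSample(4)
    viz.go()

    # Load a gallery environment to walk around in.
    gallery = viz.addChild('gallery.osgb')      # assumed example model

    # Start the viewer inside the room at standing eye height.
    # Vizard's default mouse/keyboard navigation handles looking around.
    viz.MainView.setPosition([0, 1.8, 0])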

Day 17

For the first half of the day, I worked on Vizard and started to look at the possibility of creating the final 3D model for the virtual space. For the second half of the day, I resumed work on the AOIs for the experiment.

Friday, July 29, 2016

Day 16

The first half of our day was taken up by a field trip to the Eastman House. We went on a tour of one of the exhibits, and then also got a private tour of the conservator's lab. Needless to say, the trip was awesome and informative, even though it didn't have much to do with what I am working on currently. We then went to lunch at Amiel's, which was delicious. When I got back, I continued work on Vizard and the virtual reality space.

Wednesday, July 27, 2016

Day 15

Today we started working fully in Python and Vizard. I created a simple scene with some pre-made 3D models and animated some plants to move around and spin. I also learned about backface culling: faces of geometry whose normals point away from the camera can be skipped during rendering. Culling those faces improves performance whenever the viewer never needs to see the far side of an object.
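
For anyone curious, the script I ended up with looked roughly like the sketch below. This is a minimal reconstruction, not my exact code: the 'plant.osgb' model name and the viz.CULL_FACE flag are my best guesses at Vizard's bundled resources and state-flag names.

    import viz
    import vizact

    viz.setMultiSample(4)   # smooth out edges a little
    viz.go()                # open the Vizard graphics window

    # Put the viewpoint a few meters back, at roughly standing eye height.
    viz.MainView.setPosition([0, 1.8, -5])

    # Add a few pre-made plant models in a row.
    plants = []
    for i in range(3):
        plant = viz.addChild('plant.osgb')      # assumed bundled model name
        plant.setPosition([i * 2 - 2, 0, 2])
        plants.append(plant)

    for plant in plants:
        # Spin 90 degrees per second around the vertical (y) axis.
        plant.addAction(vizact.spin(0, 1, 0, 90))
        # Backface culling: skip drawing faces whose normals point away
        # from the camera. Only worth enabling when the far side of the
        # geometry is never visible to the viewer.
        plant.enable(viz.CULL_FACE)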

Day 14

For the first half of the day, we looked at and organized more data, and got a little bit further through the mobile eye tracker data. For the second half of the day, I was introduced to Vizard, the software the Perform Lab uses to create virtual environments for use with a VR helmet.

Day 13

Today we were informed that we were going to start working in the Perform Lab tomorrow with virtual reality, which was exciting. Other than that, the day was filled with looking at data.

Friday, July 22, 2016

Day 12

Today we finished up with our last subject for now, and then started to dive fully into analyzing the data. Unfortunately, another experiment was scheduled in the lab today, so we couldn't use it for very long.

Wednesday, July 20, 2016

Day 11

Today we finished up with Group 1 of testing and started sorting through the data we collected from them. Also, I talked with Chris Kanan about computer science and machine vision, and he gave me some very insightful information.

Tuesday, July 19, 2016

Day 10

Today we went through another round of trials in the morning, which went very smoothly. We also started to put the framework in place to analyze our data fully, including making more spreadsheets and starting to draw AOIs, or Areas of Interest, on our video recordings.

Day 9

Today we started the first wave of trials for our experiment, which went very well. We also briefly looked over the results we got, and I started to construct a spreadsheet of the data.

Friday, July 15, 2016

Day 8

Today we started setting up the non-computer portion of our experiment, and did a dry run of that segment. We also started to learn about areas of interest and how we can use them to speed up our data analysis.
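
To give a sense of why AOIs speed things up, here is a toy Python sketch (not the SMI workflow; every name and number is made up) that buckets fixation points into labeled rectangles instead of inspecting each gaze sample by hand.

    # Each AOI is a named rectangle in screen coordinates (pixels).
    aois = {
        'painting_left':  (100, 200, 500, 800),    # (x_min, y_min, x_max, y_max)
        'painting_right': (700, 200, 1100, 800),
    }

    # Example fixation points as (x, y) screen coordinates.
    fixations = [(150, 400), (820, 350), (900, 600), (60, 50)]

    def aoi_for(point, aois):
        """Return the name of the AOI containing the point, or None."""
        x, y = point
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                return name
        return None

    # Tally how many fixations landed in each AOI.
    counts = {}
    for fix in fixations:
        name = aoi_for(fix, aois)
        if name is not None:
            counts[name] = counts.get(name, 0) + 1

    print(counts)   # {'painting_left': 1, 'painting_right': 2}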

Abstract


Eye tracking and studies involving eye tracking have a long and complicated history. Traditionally, observers taking part in a study using a remote eye tracker were restricted in their mobility, often having their head and body confined while viewing stimuli on a computer screen in order to get the best possible readings. Recently, however, the mobile eye tracker has become increasingly popular in eye tracking studies. The observer is now able to move around, and the chosen stimuli have become far more akin to real-life situations. Unfortunately, data taken with mobile eye trackers is less accurate than data from remote eye trackers, and the convenience of setting up an experiment in a fully controlled environment is traded for fidelity to real life. Furthermore, mobile eye trackers are much more costly and time-consuming to use than remote eye trackers. The purpose of our study is to determine whether there is a measurable difference in eye movements and reactions to stimuli in a real-life scenario as opposed to a virtual one. Moreover, if such a difference exists, we will assess the viability of virtual reality technology as a middle ground between the convenience of a virtual setting and the actuality of a real setting. To perform this experiment, we will be using eye tracking devices and software from SMI to collect and analyze our data. We hope that the information we gather will be useful for future experimental design with eye trackers.

Day 7

Today we finalized our experiment design and worked on our abstract in the morning. Then we went out for a nice lunch at Global Village for an hour. When we came back, we did a dry run of part of our experiment, which I can't spoil because some of the other interns will be our observers.

Thursday, July 14, 2016

Day 6

Today we made a lot more progress on our experiment. We finalized our stimuli for use in the experiment, and actually finished up our experiment design in SMI Experiment Center. By the end of the day, we had tested the experiment with the remote eye tracker and figured out how to import the data we gathered into our analysis program.

Wednesday, July 13, 2016

Day 5

Today we started to get into designing our experiment. The basic principle is to compare eye movements between a real scenario and a virtual scenario. We first discussed our questionnaire and what we would ask our participants. Then we started to learn about the calibration of the mobile eye tracker and tested it out in one of the campus's many galleries. After that, we started to design the virtual part of the experiment, and we are making good progress on it.

Tuesday, July 12, 2016

Day 4

Today we finally met the two professors who we will be working with for the next three weeks. They introduced us to their idea for an experiment, and asked us to start thinking about designing the experiment and what we want to do with it. We also took a look at the software that we will be using to design experiments on the computer, as well as both the remote and mobile eye trackers. Afterwards, we all went to a talk about salience and its relation to color, which I thought was very enjoyable and informative.

Friday, July 8, 2016

Day 3

Getting into the weekend, I had a more relaxed day today. Our group first discussed papers that we had picked out related to vision, and learned how to use Google Scholar to find academic papers. Then we had an extended lunch, where all of the interns struggled to grill hamburgers and set up a volleyball net to play. After that, we came back and looked at some more papers about both human vision and machine vision. Overall, a more relaxed day to take me into the weekend, which I appreciated.

Thursday, July 7, 2016

Day 2

For Day 2, we started out by organizing food accommodations for the picnic tomorrow. I don't have to bring anything in yet, but I was warned that I would have to bring in something next time. After that, my group went to the Virtual Reality lab and got briefed on how virtual reality is being used to support the human vision work in the rest of the lab. Then we went to boot camp again and learned about a new eye tracker that Jeff was developing. After that was lunch, where I had a turkey sandwich. Not quite as good as yesterday for sure. Then, for the rest of the day, we each got video taken of our eyes, and Jeff explained how and why he used the apparatus he did to take the video. Overall a great day.

Wednesday, July 6, 2016

Day 1

The internship has started! I am in the Visual Perception lab with three other interns, and at least one student. To start off the day we took a brief tour of all of the labs in the Center, including labs that were not part of the internship but fascinating nonetheless. After the tour, all the interns were taken to the Red Barn to do some team-building activities. Despite the scorching heat, I had fun meeting my fellow interns, playing games, and working through problems with them. After that was lunch, and then we finally split into our groups. I share the lab with Alice and Maria, both very nice girls. From 1:00 to about 4:30, we sat in Visual Perception "boot camp", learning about the structure of the eye, how it relates to the brain, and how it is controlled by the brain and the surrounding muscles. We also briefly explored the relationship between eye movements and the task given to an observer, as well as learning about eye tracking technologies. Overall, it was a great first day and first experience for me.