
9. Stereometry

October 17, 2010

Stereo imaging is a method for reconstructing the 3D properties of an object from information collected in 2D images.  The method has important applications in recording and recovering elevations and depths from images.

Figure 1 shows the setup of the technique, wherein two cameras (or two views of the same camera) are located a distance b from each other.  The object is projected onto the two image planes at coordinates x1 and x2, with each image plane a distance f (the focal length) from its camera center.  By similar triangles, x1 and x2 are given by the equations below.
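Taking the first camera at the origin and the second displaced by b along x (the usual convention for this geometry), a point at transverse position x and depth z projects to

$$x_1 = \frac{f\,x}{z}, \qquad x_2 = \frac{f\,(x - b)}{z}.$$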

Figure 1. Image geometry for stereo imaging (courtesy of Dr. Soriano [1])

From the equations above we will find that:
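$$z = \frac{b\,f}{x_1 - x_2}$$

which gives the depth z of each point from the disparity x1 − x2.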

The 3D shape of the object can then be reconstructed by repeating this computation over several points.

Camera calibration and stereo recording

We looked at the manual of the camera used in taking the pictures and found its focal length f to be 6.5 mm. The focal length can also be obtained from the calibration matrix derived in the previous activity on creating a geometric camera model for 3D imaging.

We captured two images of a Rubik's cube, aiming to use the corners of the small squares as reference points.  We made sure that the camera could view the same set of vertices in both shots and that the two camera positions lay at the same y-distance from the object.  The camera was moved a distance of 4 inches between the two shots.

Figure 2. Images captured from a 3D object (Rubik's cube).

3D reconstruction

We gathered as many points as we could from the object in xy image coordinates, corresponding to the corners of the small squares of the Rubik's cube.  We then calculated the z value for each point and plotted the object as a 3D mesh (with an intermediate interpolation step) [3].  Figure 3 displays three different views of the reconstructed 3D object.
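A minimal sketch of this reconstruction step in Python (the MATLAB route in [3] uses TriScatteredInterp instead); the point arrays and constants below are illustrative, not the actual measurements:

```python
import numpy as np
from scipy.interpolate import griddata
import matplotlib.pyplot as plt

# Illustrative constants and points -- not the actual measurements.
b = 4 * 0.0254                    # camera displacement: 4 inches, in metres
f = 6.5e-3                        # focal length of the camera: 6.5 mm

# Image x-coordinates (sensor units) of the same corners in the two shots,
# and the shared (x, y) image coordinates used as each point's lateral position.
x1 = np.array([1.10e-3, 1.45e-3, 1.80e-3, 1.12e-3, 1.47e-3, 1.82e-3])
x2 = np.array([0.70e-3, 1.05e-3, 1.42e-3, 0.74e-3, 1.09e-3, 1.46e-3])
xy = np.array([[0, 0], [1, 0], [2, 0], [0, 1], [1, 1], [2, 1]], dtype=float)

# Depth from the disparity: z = b f / (x1 - x2).
z = b * f / (x1 - x2)

# Interpolate the scattered points onto a regular grid so the surface can be
# rendered as a mesh (analogous to TriScatteredInterp in [3]).
gx, gy = np.meshgrid(np.linspace(0, 2, 30), np.linspace(0, 1, 20))
gz = griddata(xy, z, (gx, gy), method="linear")

fig = plt.figure()
ax = fig.add_subplot(projection="3d")   # requires matplotlib >= 3.2
ax.plot_surface(gx, gy, gz)
plt.show()
```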

Figure 3. The 3D reconstruction of the Rubik's cube at different views.

From the reconstruction obtained, we find that stereometry was able to recreate an approximate 3D representation of the object.  Though the surface of the result is satisfactory, the corner of the cube was not fully defined.  This can be attributed to minor inaccuracies in the distances during recording and to human error in picking the points used in the calculations.

Lastly, we would like to thank Dr. Soriano for the assistance and for lending us the camera and materials used in this activity.
—————————————————————————————————————

References:
[1] Soriano, 2010. Stereometry. Applied Physics 187.
[2] Marshall, 1997. Introduction to Stereo Imaging — Theory.
[3] MathWorks. TriScatteredInterp.


10. Video Processing

October 12, 2010

Videos are series of images displayed in quick succession such that they appear continuous to the human eye.  These images are flipped at a certain speed expressed in frames per second (fps).  If a video runs at a rate of 30 fps, each image is displayed for 1/30 s.

Knowing that videos are composed of images, we can apply image processing techniques to gather information from video clips.

Kinematics video

We recorded a video of a kinematics experiment in which we set up an Atwood's machine, composed of a pulley and hook weights with masses of 400 g (gray) and 500 g (blue).

Notice that we captured the video with a checkerboard as the background (square dimensions 1 in x 1 in).  With this, we aim to measure the acceleration of the system by tracking the position of the blue hook weight against time.

Preparing the clip and extracting frames

After obtaining the AVI file of the video, we used the software VirtualDub to extract the significant portion of the video as a clip.  To rotate the video into its proper orientation, we then used the tool Avidemux to put the clip in its upright position.

We then extracted the image sequence of the video at 210 frames per second.  Figure 1 shows some of the 162 frames we extracted as image files.

Frame 0 Frame 32 Frame 64
Frame 96 Frame 128 Frame 160
Figure 1. Samples of the images extracted from the video as frames.

Image color segmentation and object location

We chose the single-colored blue hook weight, which stands out from the background, as the object to track in order to monitor the motion of the system.  We used non-parametric color segmentation to locate the object in each frame.  Figure 2 shows the region of interest (ROI) detected in the frames.

Frame 0 Frame 32 Frame 64
Frame 96 Frame 128 Frame 160
Figure 2. The blue hook weight (region of interest) detected via non-parametric color segmentation.

After this, we performed morphological operations (opening and closing) to clean the images of unwanted noise and to produce better representations of the hook weight.  We then calculated the centroid of the segmented blob (a single pixel), which serves as the basis for the measurements.
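A minimal sketch of this segmentation-and-centroid step, assuming histogram backprojection in normalized (r, g) chromaticity as the non-parametric method; function names, bin count and structuring elements here are illustrative:

```python
import numpy as np
from scipy import ndimage

def roi_histogram(patch, bins=32):
    """2D (r, g) chromaticity histogram of a cropped patch of the blue weight."""
    s = patch.sum(axis=2) + 1e-6
    r = (patch[..., 0] / s).ravel()
    g = (patch[..., 1] / s).ravel()
    hist, _, _ = np.histogram2d(r, g, bins=bins, range=[[0, 1], [0, 1]])
    return hist

def locate_weight(frame, hist, bins=32):
    """Segment the weight by backprojecting the ROI histogram onto the frame,
    clean the mask with opening/closing, and return its centroid (row, col)."""
    s = frame.sum(axis=2) + 1e-6
    ri = np.clip((frame[..., 0] / s * bins).astype(int), 0, bins - 1)
    gi = np.clip((frame[..., 1] / s * bins).astype(int), 0, bins - 1)
    mask = hist[ri, gi] > 0                                          # non-parametric segmentation

    mask = ndimage.binary_opening(mask, structure=np.ones((5, 5)))   # remove speckle
    mask = ndimage.binary_closing(mask, structure=np.ones((5, 5)))   # fill small holes
    return ndimage.center_of_mass(mask)

# Example use: centroid = locate_weight(frame_rgb, roi_histogram(roi_patch))
```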

The following video shows the color segmented and morphologically processed frames.

This next video shows the weight detected and located.

Determining the system acceleration

From the centroid pixel locations obtained for each of the 162 frames, we recorded the position of the object against time (Figure 3).


Figure 3. Position (m) vs. time (s) curve obtained from measurements in the video clip.


We then determined the acceleration of the system by calculating the second derivative d²x/dt² of the data series.  Our calculations yielded an acceleration a = 0.9708 m/s².
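A sketch of how this can be computed from the centroid track, assuming either a double finite difference or a quadratic fit (the position array below is synthetic, not the measured data):

```python
import numpy as np

fps = 210.0
n = 162
t = np.arange(n) / fps                     # frame time stamps (s)

# Illustrative positions (m); in the activity these came from the centroids.
x = 0.5 * 0.97 * t**2 + 0.0005 * np.random.randn(n)

# Second derivative by double finite difference.
a_fd = np.gradient(np.gradient(x, t), t)
print("finite-difference acceleration:", a_fd.mean())

# Equivalent: fit x(t) = x0 + v0*t + (a/2)*t^2 and read off a from the fit.
coeffs = np.polyfit(t, x, 2)
print("acceleration from quadratic fit:", 2 * coeffs[0])
```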

Comparison with analytic calculations

Assuming a frictionless pulley and a string of negligible mass:
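$$a = \frac{(m_2 - m_1)\,g}{m_1 + m_2}$$

where m1 and m2 are the two masses and g is the acceleration due to gravity (this is the standard Atwood-machine result).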

For the ideal case where m1 = 0.400 kg and m2 = 0.500 kg, the acceleration is a = 1.0888 m/s².  Our value obtained from the video deviates from this by 10.84%.

Using the actual weights, on the other hand, m1 = 0.3972 kg and m2 = 0.500 kg, the acceleration is a = 1.1248 m/s², from which our calculated value deviates by 13.69%.

These deviations can be attributed to (1) the kinematics setup, where friction in the pulley introduces a slight difference from the theoretical calculations; and (2) the series of image processing techniques applied to the frames, including the color segmentation, the morphological operations and the determination of the center of mass.

With this, we have shown that image processing can be applied to data gathering and analysis not only in still pictures but also in videos.  By combining basic processing techniques, we were able to detect and track object changes and movements through space and time.

We would like to thank the UP NIP for allowing us to use the equipment needed in this experiment, and Dr. Soriano (and the VIP team) for the assistance and for letting us use the high-speed camera and checkerboard.

————————————————————————————————————
References:
[1] Soriano, 2010. Video Processing. Applied Physics 187.
[2] Hyperphysics, 2005. Atwood’s Machine.
[3] Wikipedia, 2010. Atwood Machine.


3. Familiarization with Light-Matter Interaction

September 28, 2010

Objects and figures appear the way they do because of the optical interactions that occur when light strikes them.  How light interacts with an object depends on its composition, material and texture.

We took a short walk around the NIP complex to find examples of light-matter interaction. In a true Matanglawin fashion, we squatted over puddles, observed butterflies and categorized every little thing we saw.

Reflection

Specular reflection refers to mirror-like reflection off object surfaces.  It follows the law of reflection, where the angle of the incident ray from the normal equals the angle of the reflected ray.  This is observed on smooth surfaces like mirrors, and it gives many objects their glossy appearance.

Here are some examples of specular reflection (Figures 1 to 4):

Specular (glossy)

Figure 1. Specular reflection from the hood and windows of a Hyundai. Notice how the sun is clearly reflected from the windshield.


Figure 2. Trees are reflected from the windows of a BMW.


Figure 3. Inside a classroom, arm chairs reflect the exterior of the NIP Research Wing (left). At a different angle, the smooth surface becomes a mirror for Tisza’s hand (right).


Figure 4. Squatting over puddles in the NIP Parking Lot, we find specular reflection on the smooth surface of water.

Body (matte)

Diffuse reflection, on the other hand, is observed from rough or matte surfaces.  Light is scattered, or diffused, in many directions.  Figure 5 displays a few examples of diffuse reflection.

Figure 5. Matte surfaces reflect light at various directions in contrast to specular reflection. This makes it possible for visual inspection to give you an idea of the floor-tile textures without feeling them.

Interreflection
Reciprocal reflection between two or more surfaces can sometimes occur.  The image below shows an example of this.

Figure 6. Interreflection (and transmission) within a glass block.


Transmittance

Some substances only allow a fraction (or selected wavelengths) of the light to pass through.

Figure 7. The machine shop’s very own welding mask (left) and the NIP Lecture Hall seen through it (right). The mask protects the eyes from UV rays and from harmful gases while welding.


Figure 8. Beverage that allows the transmission of yellow light.

Interference and Diffraction

Diffraction is the spreading of light around edges or boundaries.  Interference, on the other hand, is the overlap of electromagnetic waves as described by superposition.  The resultant waves can be the product of constructive or destructive interference.

Figure 9. Interference and diffraction of sunlight entering the building hallway.


Figure 10. Interference and diffraction of light from the ceiling of the NIP elevator.

Other optical properties

Thin-film Interference and Refraction

Light bouncing off a thin film can form colorful patterns due to interference, such as in bubbles.  In prisms, different wavelengths of light are transmitted at different angles as an effect of the substance’s index of refraction.

Figure 11. Thin-film interference in a soap film (left) and refraction from a prism (right).

Partial reflection and partial transmittance

In some substances, a fraction of the light is transmitted while the rest is reflected back.

Figure 12. Dennis, Tisza and Gladys partially reflected, and the tiled floor transmitted, through a glass door. The smooth surface allows the transparent door to reflect the NIP outdoors while providing a view of the NIP Faculty Wing interiors.


Different colored objects and measurement of their transmittance / reflectance / absorbance

Light interacts with matter through three processes: reflection, transmission and absorption.


We collected different objects and measured the extent to which light interacts with them through each of these processes.

Reflectance

We gathered different objects and measured the reflectance off their surfaces.  We divided each measured spectrum by the spectrum of the light source and obtained the results below.
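A minimal sketch of this normalization step (the spectra below are synthetic placeholders; in the activity both came from the spectroradiometer, one aimed at the source and one at the sample):

```python
import numpy as np

# Illustrative spectra (counts vs. wavelength).
wavelengths = np.linspace(380, 780, 401)                       # nm
source = np.exp(-((wavelengths - 560) / 150.0) ** 2)           # lamp spectrum
sample = 0.6 * source * np.exp(-((wavelengths - 520) / 40.0) ** 2)

# Reflectance as the per-wavelength ratio of reflected to incident counts.
reflectance = sample / np.clip(source, 1e-9, None)
```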

Figure 13. Reflectance of a colored cloth fan. Spectra from left to right correspond to green, blue and magenta.


Figure 14. Reflectance spectra of an umbrella colored black (left) and a laptop’s surface colored red (right).


Figure 15. Reflectance spectra of a laptop sleeve colored black (left) and a writing notebook colored red (right).

Transmittance

These objects were measured for their transmittances.

Figure 16. Transmittance of a colored cloth fan. Spectra from left to right correspond to green, blue and magenta.


Figure 17. Transmittance spectra. Left to right: a clear plastic cover, a green water tumbler, and glass window.

8. Camera Calibration

September 18, 2010

*performed on 15 September 2010.

Taking a 2D image of a 3D real-world object transforms 3D real-world coordinates into 2D image coordinates. The 3D coordinates can be recovered in reverse by performing a camera calibration.

A calibration board with uniform black and white squares (also known as a Tsai grid) is imaged. Each square corner marks a known real-world coordinate, which in turn has a pixel position in the image.

By relating real-world and image coordinates, we can obtain camera parameters that can then be used to reconstruct 3D coordinates in reverse.

We first obtained the camera parameters a by choosing 11 real-world points and their corresponding image points. Then we input these into the equation:

Qa = p

where a is the vector of camera parameters, p is the vector of image points, and Q is a matrix of the form shown in Eq. 31 of the manual. a can be solved for as:

a = (QᵀQ)⁻¹ Qᵀ p

where Qᵀ is the matrix transpose of Q and ⁻¹ indicates the matrix inverse; this is the least-squares solution of the overdetermined system.
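A minimal numerical sketch of this least-squares step, assuming the standard 11-parameter direct-linear-transform rows for Q (the point pairs below are placeholders, not the actual grid corners):

```python
import numpy as np

def calibration_rows(world, image):
    """Two rows of Q and two entries of p contributed by one world point
    (X, Y, Z) and its image point (x, y), in the standard 11-parameter
    direct-linear-transform form."""
    X, Y, Z = world
    x, y = image
    return ([[X, Y, Z, 1, 0, 0, 0, 0, -x * X, -x * Y, -x * Z],
             [0, 0, 0, 0, X, Y, Z, 1, -y * X, -y * Y, -y * Z]],
            [x, y])

# Placeholder point pairs -- in the activity, 11 corners picked on the Tsai grid.
world_pts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 0), (1, 0, 1)]
image_pts = [(320, 240), (400, 238), (322, 160), (318, 300), (402, 158), (398, 298)]

Q, p = [], []
for w, i in zip(world_pts, image_pts):
    rows, vals = calibration_rows(w, i)
    Q.extend(rows)
    p.extend(vals)
Q, p = np.array(Q, float), np.array(p, float)

# a = (Q^T Q)^-1 Q^T p; numpy's lstsq returns the same least-squares solution.
a, *_ = np.linalg.lstsq(Q, p, rcond=None)
print(a)
```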

We chose 11 real-world points and their image coordinates and using these we obtained a to be:

[-27.8, -45.1, -129.8, 1946.2, -151.0, -11.0, -4.8, 1972.3, -0.023, -0.040, -0.0038]

The points chosen for calibration are a mix of points on the corners and edges of the Tsai grid and points near the origin (0, 0, 0).

Next we verify the calibration by picking real-world coordinates and predicting their respective image coordinates.

Six points were chosen, and their image coordinates were calculated using Eqs. 29 and 30 in the manual (the explicit solutions for xi and yi). The result is shown below.


The magenta dots are the points used for camera calibration and the green dots are the predicted image coordinates. The predicted image coordinates lie very close to the selected corners and are only a few pixels off. This shows that the prediction is quite accurate and that the calibration was done correctly.


5. Measuring the Gamut of Color Displays and Prints

August 13, 2010

Gamut chromaticity coordinates calculated from data collected for different color hardware:
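For reference, each pair of chromaticity coordinates follows from the measured tristimulus values in the usual way:

$$x = \frac{X}{X + Y + Z}, \qquad y = \frac{Y}{X + Y + Z}.$$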

Acer                OAC CRT             HP Printer          Epson Projector
x         y         x         y         x         y         x         y
0.46936   0.30732   0.52667   0.37543   0.36163   0.43258   0.59044   0.34712
0.35031   0.57938   0.28870   0.60720   0.33676   0.33430   0.37779   0.58638
0.17192   0.16895   0.13343   0.11898   0.26937   0.35870   0.14294   0.07478
0.46936   0.30732   0.52667   0.37543   0.36163   0.43258   0.59044   0.34712

(The last row repeats the first to close each gamut polygon.)

Figure 1 shows these chromaticity coordinates superimposed on the CIE xy tongue.

Figure 1. Visualization of the chromaticity coordinates superimposed on the CIE xy tongue.

2. Familiarization with Properties of Light Sources

August 7, 2010

This activity aims to familiarize us with different light sources – their emission spectra, and how they emit light.  The emittance spectrum of a blackbody radiator will also be computed for various temperatures.

Emittance spectrum of different light sources

The spectra of different light sources were measured using a THORLabs spectroradiometer.  Here we used a fluorescent lamp, an LED flashlight, green, red and blue fields displayed on an LCD screen, and the same colors from an LCD projector.  We aimed to compare the emittance of the different sources by comparing their spectra for the primary light colors shown in Figure 1.

Figure 1. Primary colors of light. Left to right: red, green and blue.

LCD Projector

The light source of an LCD projector is a tungsten-halogen lamp. It is an incandescent lamp with a tungsten filament that can withstand high temperatures. The bulb is also filled with an inert gas and small amounts of halogen elements like bromine or iodine, which interact with the evaporated tungsten and re-deposit it onto the filament, thus increasing the lifespan of the lamp.

The measured emittance spectra for the LCD projector are shown in Figure 2.

Figure 2. Emittance spectrum of an LCD projector.  Left to right: spectrum for red, green and blue. (Click individual images for a larger view.)

It can be observed that the peak of each color’s spectrum occurs at the wavelength of the corresponding color.

LCD Monitor

The LCD monitors in laptops are illuminated by two fluorescent lamps positioned vertically on the two sides. By applying an electric current, the liquid crystals change the polarization of the light and hence the amount of light transmitted through them.

We measured the emittance of the LCD monitor displaying the primary light colors and obtained the spectra shown in Figure 3.

Figure 3. Emittance spectrum of an LCD monitor. Left to right: spectrum for red, green and blue. (Click individual images for a larger view.)

Fluorescent lamp and LED light

The fluorescent lamp works by zapping mercury vapor in a closed tube with electricity. This excites the mercury atoms, which emit UV light. The UV light then hits the phosphor coating, which fluoresces and emits visible light.

Figure 4 shows the emittance spectrum of a fluorescent lamp measured experimentally together with a spectrum taken from the internet.  The purple line is the one taken from the web.

Figure 4. Emittance spectrum of fluorescent lamp taken experimentally (blue line) and from the web (purple line).

The two curves do not fit each other very well, but both show a very sharp peak in the bluish region and another in the yellow-green region.  Both also show a broad spectrum in the yellow-orange region.

LEDs, or light-emitting diodes, are semiconductor devices that emit light via electroluminescence. Electrons recombine with holes and release photons. The color of the LED depends on the energy gap of the semiconductor.

We can see in Figure 5 that there is very good agreement between the measured and literature spectra for the LED flashlight, although they appear to be slightly shifted.

Figure 5. Emittance spectrum of white LED taken experimentally (blue line) and from the web (purple line).

White LEDs are usually made of blue/violet LEDs with yellow phosphor.  This accounts for the characteristic spectra obtained in the experiment.

Since LED emission depends on the band-gap energy of the diode, the emission is nearly monochromatic, and white can only be produced in combination with other means of producing light; hence the multiple peaks for the white LED.  As observed, there is a sharp peak around the blue-violet region (380-450 nm) and a broadened peak around the yellow region (570-590 nm).

Planck’s Blackbody Radiation

Planck’s Blackbody radiation formula in the spectral energy density form is expressed as:
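$$u(\lambda, T) = \frac{8\pi h c}{\lambda^5}\,\frac{1}{e^{hc/\lambda k_B T} - 1}$$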

where h is Planck’s constant, c the speed of light, k_B the Boltzmann constant, and T the temperature in kelvin.

We calculated spectra for temperatures 1000K, 2500K, 5400K and 6500K and obtained the plots shown in Figure 6.
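A minimal sketch of this calculation (constants rounded; plotting omitted), assuming the spectral energy density form given above:

```python
import numpy as np

h = 6.626e-34          # Planck constant (J s)
c = 3.0e8              # speed of light (m/s)
kB = 1.381e-23         # Boltzmann constant (J/K)

def planck_energy_density(wavelength, T):
    """Spectral energy density u(lambda, T) of a blackbody at temperature T (K)."""
    return (8 * np.pi * h * c / wavelength**5) / np.expm1(h * c / (wavelength * kB * T))

wavelengths = np.linspace(100e-9, 3e-6, 500)        # 100 nm to 3 um
spectra = {T: planck_energy_density(wavelengths, T) for T in (1000, 2500, 5400, 6500)}
```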

Figure 6. Planckian emittance spectra at temperatures T = 1000K, 2500K, 5400K and 6500K.

These T parameters correspond to the usual color temperatures for the HID bulb, LED, fluorescent lamp (daylight) and color monitor, respectively.

From these plots we can see what spectral shapes to expect based on the specific color temperatures of our light sources.

Correlated Color Temperature (CCT)

The CCT is an important characteristic of visible light, with significant applications in various fields.  A light source’s color temperature is related to Planck’s law and corresponds to the temperature of the ideal blackbody that radiates light of comparable hue.  Following are examples of CCT labels found on common light bulbs.

Figure 7. CCT labels on different commercial light bulbs.

Knowing the characteristic emittance of light sources and its relation to temperature has already led to various applications in photography, lighting and the arts.  However, acquiring more accurate measurements would require considerably more expensive equipment.

We acknowledge Dr. Soriano for allowing us to use the apparatuses needed to accomplish these experiments.

——————————————————————————————————————

References:
[1] Soriano, 2010. Familiarization with Properties of Light Sources. Applied Physics 187.
[2] Wikipedia, 2010. Halogen lamp.
[3] Wikipedia, 2010. Light-emitting diode.
[4] Tyson, 2010. Creating an LCD. How Stuff Works.
[5] Wikipedia, 2010. Color Temperature.


1. Sensing Properties of the Human Eye

July 23, 2010

In studying human vision, it is important to first learn the sensory limits of the eye.  Here we perform a series of experiments that explore the basic defining characteristics of the eye, including its optimal working distances and the limitations of the human visual system.

In these experiments, we used the following materials: a meter stick or tape measure, a ruler, a calculator, colored paper and black bags.

Minimum focus distance

For this part, we measure the minimum focus distance of our eyes. For each eye, we hold a pen in front of it and move it closer and closer until the eye can no longer focus on it. We then measure this distance, which is the quantity we are looking for (Figure 1).

Figure 1. Measurement of the minimum focus of the eye.

The results for the three of us are:

            Tisza     Gladys    Dennis
left eye    20 cm     12.4 cm   11.2 cm
right eye   18.5 cm   13.1 cm   11.8 cm

Generally, each eye has a different minimum focus distance. Dennis has the shortest focus distance, followed by Gladys and then Tisza. Note also that Dennis is near-sighted, which may be why his minimum focus distance is the shortest.

The minimum focus distance also varies with age.  Children have the shortest minimum focus distance, and as people age the minimum focus distance increases. This is because the lens inside the eye becomes less and less elastic with age, so its accommodation is reduced.

Maximum angle of peripheral vision

The human eye, even while focusing on a certain object or area, is still able to partially perceive things around it.  In this section, we measure the largest angle at which peripheral vision is still effective.  We let the subject fixate on an eye-level point on a wall.  A pen is then moved from the center toward the right until the subject can no longer see it (Figure 2).  The distance of the pen from the center, as well as the distance of the subject’s eyes from the wall, is recorded.  The same procedure is repeated toward the left, downward and upward directions.
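The angles reported below presumably follow from these two distances in the usual way,

$$\theta = \tan^{-1}\!\left(\frac{d_\text{pen}}{d_\text{wall}}\right).$$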

Figure 2. Measurement of the maximum peripheral distance of both eyes.

The data recorded for the three test subjects are as follows:

        Tisza    Gladys   Dennis
left    52.8˚    59.7˚    57.9˚
right   50.2˚    55.6˚    57.3˚
up      31.8˚    34.8˚    42.3˚
down    42.3˚    45.6˚    53.7˚

Results show fairly similar values of at least 50 degrees of peripheral vision to the left and right, with small variation among the subjects.  However, a significant difference was observed for the vertical periphery: Dennis is able to see a wider range both upward and downward.  We recommend that this experiment be performed on more test subjects in order to draw conclusions on which factors (age, gender, etc.) contribute to the differences.  There also appears to be a significant difference between the side-to-side and up-down angles; we seem to see a wider range horizontally than vertically.  It is also interesting to note that in all three test subjects, the maximum angle of downward peripheral vision is greater than that of upward peripheral vision.

Visual acuity

Here we find the maximum angle within which one is able to focus through the fovea – the area of the retina with a high concentration of photoreceptors that lets us see fine detail, but only within a small range.  We printed on paper a series of alphanumeric characters randomly arranged in several lines (font size 9 – 12 pt).  A subject stands about 25 – 50 cm away from the text posted on the wall and is asked whether the next uncovered character is in focus.  A measurement is made from the fixation point to the last letter that remains in focus.  The procedure is done by uncovering the letters toward one side first and then repeating it in the other direction.

The following table summarizes the data for the three subjects:

        Tisza    Gladys   Dennis
left    1.02˚    0.99˚    0.61˚
right   1.02˚    0.99˚    0.82˚

Results are fairly consistent for both directions, with a little variation in Dennis’ data. However, it can be noticed that Tisza and Gladys displayed higher visual acuity compared to Dennis.  Again, the factors that might have led to these differences should be investigated further.

Scotopic and Photopic Vision

Scotopic vision refers to vision under low-light conditions, while photopic vision is vision under well-lit conditions.  Under varying lighting conditions, the human eye also displays a certain degree of sensitivity to selected colors.  Here we investigate which colors are first observed or identified by the eye at the dimmest light level and upon gradual increase of light.

We gathered strips of colored paper and randomly lined them up inside a black bag through which almost no light can pass (Figure 3).  The subject looks into the bag while light is slowly let in.  He or she is then asked to name the colors seen, in the order in which they were perceived.

Figure 3. The black bag and the colored strips used for investigating scotopic and photopic vision.

The following is the order in which the colors were perceived by the subjects:

     Tisza     Gladys    Dennis
1    yellow    yellow    orange
2    blue      green     yellow
3    orange    orange    green
4    red       indigo    blue
5    green     blue      violet
6    violet    violet    indigo
7    indigo    red       red

During the experiment, the subjects reported some observations on color perception under limited light: seeing multiple yellows, and seeing orange but later realizing it was red.  We attribute this to the idea that the eye is first sensitive to yellow, which causes some colors to be misinterpreted when only part of their hue is perceived.

Exploration – Blind Spot

The blind spot is the area of the retina with no photoreceptors; it is where the optic nerve passes through.  It is not noticeable when both eyes are open, since the missing detail in one eye is compensated for by the other.  It is possible to determine the location and width of the blind spot with a simple test.

We drew, on a piece of paper, two figures – a cross and a circle – 7 cm apart.  To locate the blind spot of the right eye, the subject covers his left eye and focuses the right one on the cross (Figure 4).  He then adjusts the distance of his eye from the paper, while keeping focus on the cross, until he can no longer see the circle to the right.  A symmetric procedure is done for the left eye.

Figure 4. Measurement of the blind spot.

The following table shows the range of angles of the blind spot from the focus of the eye:

             Tisza            Gladys           Dennis
left eye     12.3˚ – 17.6˚    13.1˚ – 16.9˚    11.6˚ – 15.6˚
right eye    14.3˚ – 16.3˚    16.3˚ – 17.6˚    12.0˚ – 16.9˚

The blind spot of the right eye is to its right, and the blind spot of the left eye is to its left. The ranges of angles for the subjects are close to one another. Interestingly, however, the blind spot of Gladys’ right eye seems to be smaller.

Being familiar with the optimal parameters (and limitations) of the eye, we can better understand how the human visual system works. If we are to create artificial systems that cater to the eyes, we are then aware of the factors that affect human vision, and can effectively reflect, or even imitate, them in artificial computer vision.

——————————————————————————————————-
References:
[1] Soriano, 2010. Sensing properties of the human eye. Applied Physics 187.
[2] Deering, 1998. The limits of human vision. Sun Microsystems.
