THE FUTURE OF SPACE COMMUNICATION

This paper is a study of the future of satellites in today's world. Over the past several decades, awareness of satellites has grown to the point that owning one is becoming a necessity for a country. The satellite era began in the late 1950s, when the first satellite was launched into space by the Soviet Union.

Satellites are known to have increased our efficiency and performance in business, military operations, and global networks. It is hard to imagine where our world would be today without them. This paper discusses how every satellite works within a range of frequencies and requires a certain transmit power to reach the ground base stations placed on the earth's surface. It mentions a number of systems that make use of space communication, namely spaceships, aircraft, submarines, GPS navigation, mobile phones, and internet servers.
There are major bodies in every country responsible for space communication devices. In the US, NASA handles the building and launching of satellites into space.

It discusses how the world is becoming over-reliant on telecommunication in its daily activities. This trend appears set to continue, raising the concern that space may one day become oversaturated with satellites.

We took a step back in time to review how the whole concept of space communication began, with the military bouncing radio signals off the moon to see whether they would get a response, and they did. Further research on space communication took over from there, and the result is what we enjoy today.
We analyzed the mode of operation of satellites and the various types classified by function: navigation satellites, military satellites, communication satellites, scientific satellites, and weather satellites. In discussing each, the research also focused on technological advances in the construction of pico- and nanosatellites, which promise a solution to the crowding and satellite collisions that could be experienced more often in the future.

Various satellites serve different functions, but their features are almost the same, with only small differences. The common components include the antenna that serves as receiver and transmitter, the transponder, a very high-resolution camera, a channel of specified bandwidth, and solar panels and solar cells.

How long it would take to migrate to a world of mini-satellites remains unknown, but what is obvious is the major milestone that will have been reached if it finally becomes reality. Even though some have been launched into space, mini-satellites are still at the testing stage, and it could take a long time for them to reach a satisfactory level.

Finally, this essay sheds light on the challenges and issues faced by space communication. These challenges range from fuel deposition to the space debris caused by collisions between satellites, and many more.

Though these challenges could be major setbacks, they are small compared with what has been achieved through space communication and what can still be achieved. Major steps are being considered to overcome them.

This research sheds more light on the many applications of space communication and serves as an eye-opener on the reasons why every country should own a satellite. Can you imagine that a nation's economy can be directly impacted by the launch of a satellite into space?
No doubt other developing countries are beginning to see the advantages and are following suit. This is evident in NigComSat-1, launched into space by Nigeria.

INTRODUCTION
Space communication has evolved into a vital aspect of communication today. The convenience derived from it has led many scientists to focus on its development and on how much greater the benefits from its study could become. Space communication basically operates through a satellite placed high above the earth, interacting with other telecommunication devices all around the world. In this write-up we discuss the applications of space communication, describe its technological aspects, and examine the challenges and issues scientists face in its development as demand grows.

Research on space communication has led to the establishment of many bodies around the world with the goal of tapping into the benefits of the technology. The satellites installed in space are among the major essentials for achieving space communication. In 1957, the Soviet Union launched the world's first artificial satellite, Sputnik I (U.S. Satellite Communications Systems). This has led many nations to install their own satellites in space. There is always a base station where the major operations for communicating with these satellites are carried out. Each satellite works within a range of frequencies, and any telecommunication device that operates within that range can interact with it.

Equipment that makes use of space communication, acquired mostly via satellites, includes spaceships, aircraft, submarines, GPS navigation, mobile phones, and the internet. Satellites have been found to increase the speed of communication. Advances in calling have also brought more flexibility to mobile telecommunication, as is evident in video calling. Video calling is unique in that, besides transmitting and receiving voice data over the network, you can also transmit video: through video calling you can actually see the person you are communicating with. Such an achievement was a far cry before the advent of space communication. The satellites used for space communication are responsible for video calling and many other telecommunication services.

As stated earlier, every satellite owned by a country has a specifically allocated frequency range, and satellites must also be kept a certain distance apart; otherwise what is called frequency interference occurs, causing signal distortion. Who, then, is saddled with the responsibility for the design of these satellites in the United States? That is where the National Aeronautics and Space Administration (NASA) comes into play. Since the various satellites installed in space send and receive very high-frequency signals to and from base stations set up at any location on earth, it has become essential to have an organization in charge of allocating these signal frequencies to avoid overlaps that could lead to poor signals.

Before September 15, 2003, the regulatory framework for satellite communications was based on decision 467/00/CONS of the Autorità per le Garanzie nelle Comunicazioni (AGCOM) (Giovanni Santella, Roberto De Martino).

The world today has become so reliant on space communication that you do not even want to imagine how many dollars would be lost if the signal in space were lost for just a few seconds. Military operations in war and ongoing business transactions would collapse totally. Or imagine an airplane losing the signal from the airport tower: that is like the blind leading the blind. Therefore, for optimum efficiency, it is important not to allow any form of interruption or signal loss in space communication via satellites.

Though research into space communications and other technological breakthroughs was commercially driven, it is still very beneficial to military operations (JDCB).
Space communication is a further leap in technology, and it has its disadvantages as well as its advantages. Though the development has been viewed from a positive point of view, many deadly weapons have become operational since the advent of space technologies.

Besides the negatives, the advent of space communication has brought about the connectivity of the globe. The world seems a smaller place to live. Business transactions that once required hours can now be performed within seconds. You do not need to take the next flight from Texas to Chicago just to deliver a mere letter; email, fax, and the telegraph have simply replaced that.

HOW DID IT ALL START?
The U.S. Navy began conducting experiments in 1954, bouncing radio signals off the moon. These experiments led to the world's first operational space communications system, called Communication by Moon Relay (U.S. Satellite Communications Systems).

The radio signals were generated from their base stations, and what they realized was that they were getting echoes back from the moon, which thus served as a relay for their signals. Further research on this newfound technique was channeled into linking two or more countries using such technology. Billions of dollars have been spent to build these great structures and install them in space. America first launched Echo 1 through NASA and followed with Echo II some years later.

A satellite is a well-organized device with distinct paths for every signal it sends to and receives from the base stations. Satellites make use of the Ku- and Ka-bands, which carry the space communication we experience in our world today. The Ku-band is characterized by its diversity systems, via a satellite diversity unit, which avoids the problem of signal attenuation caused by bad weather or distance. It has a multi-satellite reception capability, such that if a particular signal pathway is bad it makes use of several others (T. Hatsuda, Y. Iwamori, M. Sasaki, Y. Tushima, N. Yosimura, T. Zakouji, K. Kawasaki, T. Yamashita, M. Kuroda, K. Imai, Y. Maekawa).

It also has what is called a repeater, which maintains the strength of a sent or received signal over a long distance, thereby avoiding what is known as signal attenuation.
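To make the attenuation problem concrete, the standard free-space path loss formula, FSPL = 20·log10(4πdf/c), shows how quickly a radio signal weakens with distance d and carrier frequency f. The short Python sketch below is illustrative only; the orbit altitude and frequency are textbook values, not figures drawn from this paper's sources.

```python
import math

def free_space_path_loss_db(distance_m: float, frequency_hz: float) -> float:
    """Free-space path loss in dB: 20 * log10(4 * pi * d * f / c)."""
    c = 3.0e8  # speed of light in m/s
    return 20 * math.log10(4 * math.pi * distance_m * frequency_hz / c)

# Example: a 12 GHz Ku-band downlink from geostationary orbit (~35,786 km)
loss = free_space_path_loss_db(35_786_000, 12e9)
print(f"Path loss: {loss:.1f} dB")  # roughly 205 dB
```

A loss in the region of 205 dB is exactly why high-gain antennas and on-board repeaters are needed to keep the received signal usable.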

Many countries have installed satellites in space, and the number keeps increasing. Nigeria, one of the developing nations, installed one of the most recent satellites in space, called NigComSat-1. Other craft placed in space include Apollo 11, 14, and 15, deployed by the Americans; Luna, co-built by Russia and France; Beacon-C; and others.

HOW DOES THIS WORK?
Every satellite has a corresponding frequency range, bandwidth, and transmission power as its design specification. Every country configures the devices at its base station to the specifications of its satellites. Such devices can therefore send radio signals that will reach the satellite in space no matter where in the world they are located, so long as they transmit at the specified frequency. Any radio signal sent to space takes the form of a packet, each with a specified transmission power, frequency range, and wavelength associated with it. Such a radio signal or packet from the base station goes into space and begins its search for satellites that are compatible with it. Other technological interactions occur between the various satellites installed in space in order to detect which one the base station is trying to interact with. After receiving the signal sent by the device on earth, the satellite analyzes the data packet, which has destination addresses written on it. These addresses are used by the satellite to find the specific location of the devices meant to receive the data packet. Having recognized the destination, the satellite transmits the radio signal it has received to the destination. The destination device, which could be a telecommunication base station, a VSAT, a satellite dish, etc., receives the data and sends a form of acknowledgement back to the satellite in space confirming that it has received the data sent. That obviously tells us that the devices on earth, too, must have a receiver and a transmitter to enable space communication.

Canada has a long history of activity in outer space. In 1962, Canada became the third country in the world to design and build its own satellite, when it launched the Alouette I research satellite (Wright).
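As a rough illustration of the address-based relaying described above, here is a minimal Python sketch. The Packet and Satellite classes, their fields, and the frequency band are invented for illustration; a real transponder chain is far more involved.

```python
from dataclasses import dataclass

@dataclass
class Packet:
    frequency_ghz: float  # carrier frequency the packet is transmitted on
    destination: str      # address of the ground device meant to receive it
    payload: str

class Satellite:
    """Toy model: accept packets inside the satellite's frequency band
    and forward them toward the addressed ground station."""

    def __init__(self, band_ghz: tuple[float, float]):
        self.band_ghz = band_ghz

    def compatible(self, pkt: Packet) -> bool:
        low, high = self.band_ghz
        return low <= pkt.frequency_ghz <= high

    def relay(self, pkt: Packet) -> str:
        if not self.compatible(pkt):
            return "dropped: frequency outside this satellite's band"
        # A real satellite's transponder also shifts the uplink frequency
        # to a downlink frequency before re-transmitting.
        return f"forwarded to {pkt.destination}; awaiting acknowledgement"

sat = Satellite(band_ghz=(11.7, 12.7))  # an assumed Ku-band downlink range
print(sat.relay(Packet(12.0, "ground-station-A", "hello")))
```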

Communication through space is a two-way process. The satellite in space and every device on earth have transmitters and receivers. While the transmitter serves as the way out for any signal, the receiver serves as the way in.

A space satellite must have a characteristically large bandwidth to give many devices access at the same time. It is hard to imagine the billions of bits of data passing through each satellite per second. Without a large bandwidth, the satellite in space quickly reaches full capacity; excess data is dropped, and the result is poor signals or a total loss of radio signals. This is evident in the scrambling of video or voice data that is sometimes noticeable.
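The link between bandwidth and capacity that this paragraph appeals to is captured by the Shannon-Hartley theorem, C = B·log2(1 + S/N). A small sketch, using an assumed (but typical) 36 MHz transponder bandwidth and a 20 dB signal-to-noise ratio; the figures are illustrative, not taken from this paper's sources.

```python
import math

def channel_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley limit: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

snr = 10 ** (20 / 10)  # 20 dB expressed as a linear power ratio (100x)
capacity = channel_capacity_bps(36e6, snr)
print(f"{capacity / 1e6:.0f} Mbit/s")  # about 240 Mbit/s for one transponder
```

Once traffic approaches this ceiling, the only options are more bandwidth, a better signal-to-noise ratio, or dropping data, which is exactly the degradation described above.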

Every satellite moves in orbit around the earth, sending specific signals to the base station receivers on the earth's surface. The exact location of a person on earth can be determined from the information held by the satellites in orbit. Besides locating a person on earth, satellites can also be used to find the positions of submarines, aircraft, spaceships, army base camps, etc. This is one of the most important uses of satellites in space communication.
There are various satellites now in existence, each performing specific functions, even though they all have the ability to communicate through their receivers and transmitters. The different types are communication satellites, navigation satellites, weather satellites, military satellites, and scientific satellites. More details on their functions are discussed under the next sub-heading.
Each satellite is built with its special purpose in the engineers' minds. Let us take a brief look at each type.

Communication satellites are mainly responsible for all communication involving data, voice, and television transmission from earth into space and back to earth. A communication satellite links two or more distant points, since its radio signal is strong enough to reach each device.

Navigation satellites find their application in tracking locations using GPS (the Global Positioning System). They do this by combining the details of signals received from several satellites at once. GPS technology can be used to view the map of an entire continent. Sometimes, when you are in a city and take a look at the screen of your phone, you can see the exact location where you are and wonder how your service provider got to know it. That is achieved via GPS technology.
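Strictly speaking, a GPS receiver fixes its position by trilateration: it converts signal travel times from several satellites at known positions into distances, then solves for the one point consistent with all of them. The sketch below shows the idea in two dimensions with three reference points; the coordinates are made up for illustration, and a real receiver solves the 3D problem with an extra unknown for its clock bias.

```python
def trilaterate_2d(p1, r1, p2, r2, p3, r3):
    """Solve for (x, y) given distances r_i to three known points p_i.
    Subtracting the circle equations pairwise leaves a 2x2 linear system."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 - x1**2 + x3**2 - y1**2 + y3**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Receiver actually at (3, 4); distances measured from three reference points
pos = trilaterate_2d((0, 0), 5.0, (10, 0), 65 ** 0.5, (0, 10), 45 ** 0.5)
print(pos)  # -> (3.0, 4.0)
```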

Weather satellites, as the name suggests, are mostly used by geographers and meteorologists to study the weather. A lot of signal technology is involved in this process, which we will look at more deeply in due course. Weather forecasting would be impossible without this type of satellite.
Military satellites are a complex type to discuss, due to their many functions; they seem to combine various properties of the other types of satellite. For instance, military satellites are used to communicate from the military base to soldiers in the war zone. They are also used to guide missiles via GPS.

Scientific satellites are used by astronomers to study space and the activities and movements of celestial bodies. Cloud patterns can be viewed from them, and other planets can also be studied using scientific satellites. The distances, shapes, sizes, etc. of planetary objects can be determined with this kind of satellite.

Some of these varying satellites have similar facilities, including cameras, transmitters and receivers, channels, etc., and they all use the same kind of signal: radio signals.

APPLICATIONS
The concept of space communication and technology cannot be treated separately from the discussion of satellites, their basic functions, and their mode of operation; they go hand in hand. We shall elaborate on the various applications of space communication in our world today and on how it helps us interact with our neighboring planets in outer space.

It is no overstatement to say that the usefulness of satellites in space communication cannot be overemphasized. A lot has been made far easier by their use worldwide. Places we never imagined we could have contact with can now be reached within a few seconds, research and information that were once unattainable are now possible, and much more.

COMMUNICATION SATELLITES
Used basically for communication between two or more devices, the communication satellite has become essential in our day-to-day business activities and casual conversations, making the globe feel smaller than it really is. High-strength, high-frequency radio waves are the medium by which most of the signals are sent and received. The signals used in communication satellites are of three types, namely data, video, and voice. Unlike the case of military satellites, these signals are not encrypted, so the level of security is very low. Communication satellites have direct applications in satellite cable, cell phones, email and the internet, fax, and telegraphy. The satellite sends information and data down to all receivers on earth on the downlink, while it receives signals from devices on earth by means of the uplink. Pilots in the air use it to contact the airport for directions. Without the satellite it would be difficult for two cell phones or other telecommunication devices to connect with each other, even if you put them next to each other, since they use the satellite in space as an intermediary.

It is very important to note that the communication satellite was designed in such a way that it must be powered continually. After considering what to use, scientists decided on the most reliable and easy-to-maintain source of energy, which is solar energy. The components of communication satellites therefore include the solar panel, which absorbs sunlight to charge the solar cells that keep the satellite running constantly. Other components include the antenna, serving as receiver and transmitter, which is common to all satellites.

The devices that make use of the communication satellite are many: our usual mobile phones, which have their own antennas for receiving and sending signals, televisions, satellite cable, and internet servers. Over the years, dependence on communication satellites has grown so much that many businesses thrive on them. Among their many advantages is putting business transactions at your fingertips. A lot of convenience has been achieved through this invention, and a lot of companies run on the internet with no physical office. It is therefore logical to wonder what would happen if some kind of natural disaster in space destroyed all the satellites or simply interrupted their signals. Because of this over-dependence on space communication, many companies do not keep a single hard-copy record of their daily activities. A disaster that interrupts the signals from space would definitely lead to economic breakdown, loss of invaluable data, and disconnection of airplanes from their control towers, causing plane crashes. That is why most satellites in space have backups there, serving as supplements in case any fault occurs in the original device.

NAVIGATION SATELLITES
The advent of navigation satellites has made it possible to track the position of objects on earth. Communication satellites are limited in this respect: if you are making a call or sending an email, there is no way to know where the other party is; no position tracking is possible with a communication satellite. Besides just tracking positions, navigation satellites map out surroundings through Global Positioning System (GPS) services. As the captain of a ship at sea, you need a bigger picture of where you are heading. It can be very confusing to travel by ship on the ocean, where there are no roads or paths leading to your destination; without a map to guide you, it would be all too easy to keep traveling in circles without knowing it. Only with a map do you know when to turn or keep moving straight, and this map can be provided by the GPS signals sent from the navigation satellites. Besides, missiles and space shuttles can be guided by GPS signals, and GPS chips fitted in wristwatches and cell phones can be used for tracking.

The components of a navigation satellite are not much different: an antenna (serving as receiver and transmitter), a very high-resolution video camera, channels, and solar panels and cells.

Navigation satellites find their application in the GPS signals used by submarines, street surveillance camera systems, etc. With navigation satellites it is possible to be at a military base camp and launch a missile at another country located miles away, controlling the position and direction of the missile through what is called remote sensing, which is also related to this topic. The disadvantage of the navigation satellite is its use in the military: with it, it is easier to wipe out a whole nation with guided missiles. This has contributed to the manufacture of weapons of mass destruction by powerful countries.

WEATHER SATELLITES
These satellites are used mainly for weather forecasting. Forecasting is done through the satellites in space by comparing readings of the present weather from other satellites and observing the patterns of the clouds before sending a report to a ground base station on earth. The weather report received is used to forecast the future weather. Many people depend on weather news in order to carry out their activities. The two main types of weather satellite are geostationary satellites and polar-orbiting satellites. Their usefulness to weather forecasting can therefore not be overemphasized.

The components of weather satellites include, again, an antenna used as receiver and transmitter, 24 channels for sending out radio-frequency signals, a hypersensitive detector used to sense the state of the weather, and solar panels and cells.

Airlines depend on weather reports to schedule their flights. Weather reports are also useful in scheduling other activities done in the open, ranging from rallies and crusades to festivals. Even in developed countries, individuals depend on the forecast to know whether the day will be sunny, rainy, or snowy, so that they can make the necessary preparations, like taking their umbrellas to work or wearing lighter clothing.

The disadvantage is that you cannot rely too heavily on these reports, because the weather changes fast and can be erratic in nature. Many times, people who depend solely on the weather forecast have found themselves to blame. We have heard news of airplanes that took off on the assumption that there would be no rain, only to end up high in the sky in the middle of a storm.

MILITARY SATELLITES
Military satellites find their effectiveness mostly in war zones. As stated earlier, they are a complex form of satellite that combines many features of the other types.
The Defense Satellite Communications System, or DSCS, is a major subsystem in the Global Information Grid, or GIG, providing high-throughput, long-haul communications to the Department of Defense and other special governmental users (Garth R. Hahn, Anthony Kellar & Steven Stubblefield). From the same source, it is known to consist of space, control, and ground segments.
Military satellites are used to communicate between the soldiers camped at the war front and the defense headquarters. The major unique characteristic of military satellites is the nature of the signals sent and received during communication. For the obvious reason that radio signals sent through any satellite channel can be intercepted, the defense department needed to come up with a more secure option. To make sure their information is not leaked, they came up with the idea of scrambling the messages transferred over the satellite channel. Many other security measures are also taken to make sure that data transferred via military satellites remains intact.

Military satellites carry a higher-resolution camera used to send images.
The applications of military satellites, as far as I know, lie simply in defense systems.

SCIENTIFIC SATELLITES
Scientific satellites are used for scientific purposes: to study the earth and its environs. Through them, scientists have gained better knowledge and understanding of space and of planetary objects including the sun, moon, stars, and other planets. The movement of ocean waves can also be studied by analyzing the images sent by scientific satellites, and from these images one can derive the sizes, shapes, and distances of such objects.

Like other kinds of satellites, scientific satellites also carry a high-resolution camera, which sends imaging signals showing the patterns formed by objects in space.

TECHNOLOGICAL ASPECTS
Satellite communications technology offers the unique capability of being able to simultaneously reach out to very large numbers of users, spread over large distances, even in the most remote corners of the country (A. Bhaskaranarayana, B. S. Bhatia, K. Bandyopadhyay and P. K. Jain).
Taking a brief look at space communication satellites from the technological aspect, we need to analyze the operating mechanics and procedures of satellites from the optical, nano, and pico points of view.

OPTICAL SATELLITES
The word optical calls to mind imaging through satellites. As discussed earlier, most satellites use a high-resolution camera to capture images from space and send them to a ground base station that receives the data. Such images are used to further research or to draw conclusions about space. A major aim was to capture image detail that is not visible to the human eye. Optical satellites have a wide aperture used to capture images in space.
When a radio-frequency telemetry link is used, a low-earth-orbit satellite's transmissions can easily be eavesdropped on; optical downlinks using modulated laser signals provide a far better solution, as they are hard to intercept and can carry hundreds of megabits (Giggenbach).

The data sent are thus practically undetectable.
Optical technology carries a lot of innovation besides its protection against eavesdropping: optical satellites transmit data in a fashion similar to optical fibers. Before optical technology, to receive data from a satellite, someone at the ground base station had to wait until the satellite rotated to the point in its orbital path where there was connectivity. To avoid this limitation, the optical satellite has an antenna with diffused beams, each sending out signals carrying important data. The line of sight of an optical satellite's telescope is always directed sideways from the direction of elongation.

NANO SATELLITES
Nanosatellites are very small, about the size of a human head; the idea of a nanosatellite was born out of the desire to have a satellite in space that is barely noticeable. Nanosatellites are launched by a parent craft from earth but take less time to launch into space than the usual satellites, which are much bigger. The low weight of nanosatellites also makes them suitable as the spacecraft of the future. Just as in other technologies, nanosatellites have all their internal components embedded in a microchip with the same quality and functionality. The major problem, then, comes from the aperture and the camera, which are both very small compared with those of normal satellites. What scientists are trying to achieve is the reduction in size of telemetry objects.

Small telemetry objects in space are an added advantage for the following reasons.
Cost effectiveness: it saves a nation money if nanosatellite technology can be fully achieved in the near future, instead of spending so much on the big type, which performs the same operations at equal efficiency.

Launch time: in science we always try to cut time down to the barest minimum. The fact that nanosatellites take less time to launch into space is reason enough to support the research work.
Finally, they take up much less room in space and cannot be easily detected, which makes them suitable for military operations.

PICO SATELLITES
Pico-satellites (picosats) are a miniaturized type of satellite that is very cheap to build and easy to design, with a small mass of between 0.1 kg and 1 kg. They are carried by a larger satellite that is used to launch them into space. The pico-satellite is designed such that all the massive parts usually found in larger satellites are embedded in a small microchip placed on a panel board.

Applications and Purposes of Pico- and Nano-Satellites
They find their major application in defense and the military. Due to their very small size, they have become a future technology that the defense departments of various countries install in space to spy on opponents without being easily detected, unlike the larger satellites, which are massive and very conspicuous.

The issue of satellite clustering in space has also made minute satellites, which occupy less room in the sky, a necessity; this is arguably one of the major reasons why scientists opted for the design of a smaller satellite. Another purpose of installing pico- and nanosatellites is to keep up with the rate at which developing countries are beginning to build and install their own satellites in space. To avoid satellite collisions, which deposit debris in space, scientists have worked very hard to produce a smaller version of the usual big satellites with the same or even better performance.

Finally, since the skyrocketing price of constructing a satellite is a major hindrance for developing countries, the pico- and nanosatellites were designed to reduce the cost of production and make satellites more affordable for them. Since space communication contributes a lot to the development of a nation, it was essential to encourage developing countries to own their own satellites at a much cheaper price.

In summary, the purposes of constructing pico- and nanosatellites include the following: military and defense applications, avoidance of satellite collisions, and reduction of production costs.

CHALLENGES AND ISSUES
The challenges faced by modern-day technology are enormous, enough to discourage scientists. Yet rather than discouraging them, these challenges keep scientists on their toes during research; they keep looking for better ways to improve the lot of the world through technology.
We are going to cover the possible challenges faced by each type of satellite.
One of the initial challenges was cost. It was a major problem to make a satellite far smaller than the existing ones in order to achieve a reduced cost of production. Use of GPS time and frequency in communications systems improved local oscillator (LO) stability and enhanced access protocols like open-loop Time Division Multiple Access (TDMA) (A. Bhaskaranarayana, B. S. Bhatia, K. Bandyopadhyay and P. K. Jain).
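For readers unfamiliar with TDMA, the idea is that many ground stations share one transponder by transmitting only during assigned time slots of a repeating frame; a more stable timing reference (such as GPS time, as the quoted source notes) lets the guard intervals between slots shrink, wasting less capacity. A toy sketch with invented station names and frame parameters:

```python
def tdma_schedule(stations, frame_ms=120.0, guard_ms=2.0):
    """Divide one repeating frame on a shared transponder into equal slots,
    one per station; each slot begins after a guard interval that absorbs
    clock drift and propagation-time uncertainty."""
    slot_ms = frame_ms / len(stations)
    return [(name, i * slot_ms + guard_ms, (i + 1) * slot_ms)
            for i, name in enumerate(stations)]

for name, start, end in tdma_schedule(["Lagos", "Abuja", "Kano"]):
    print(f"{name}: transmit from {start:.1f} ms to {end:.1f} ms of each frame")
```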

One of the major reasons for reducing the size of satellites was the time required for launching. Even though higher institutions and manufacturers still focus on the construction of bigger satellites, it is logical to imagine that a time will come when those kinds of satellites are considered out of place and the likes of nano- and pico-satellites become the main types available on the market. It is important to note that one limitation of communication satellites is the large bandwidth required for each channel. If radio-frequency signals of any type could be compressed, this would reduce the bandwidth needed and improve communication capacity.

It has been stated earlier that a smaller satellite is easier to design. Designing a big satellite can be very challenging, as a lot of criteria have to be taken into consideration. A satellite is not simply built physically straight away; otherwise you might find that your whole new satellite is just a beautiful edifice that does not work. Most of the components are dependent on each other; therefore, to neutralize this challenge and save stress and cost, these satellites are first designed in a software package and simulated before the equipment is finally constructed.

Most of these satellites make use of natural energy from the sun: solar radiation is trapped by the satellite's solar panel and converted into other forms of energy. Even so, the by-products that satellites and their launchers release are deposited in the space environment, causing pollution; man seems to be a major contributor of unnatural materials in space, from derelict large satellites to fuel deposits (Meshishnek).

Awareness of the need for and advantages of artificial communication satellites installed in space has been on an increasing trend over the past decades. When the first satellite, Sputnik 1, was launched, many countries did not yet know the advantages. Today, more countries are involved in the construction of satellites, bringing the number of satellites in space to around a hundred. Space is becoming increasingly crowded with these satellites. As more countries get involved, scientists will be faced with the issue of creating enough room for more satellites. That is why the modeling of smaller satellites should be taken seriously.

Have you ever imagined how it is possible for two or more satellites to collide in space? This can be caused by one satellite intercepting the path of another.
The collision on February 10, 2009 between the Iridium 33 satellite and the defunct Cosmos 2251 satellite at an altitude of 470 miles (770 km) significantly increases the amount of space debris in the region of space that is already the most crowded and has the greatest risk of collisions between orbiting objects (D. Wright).
Whether many more collisions will be witnessed in the near future depends on how soon the nano- and pico-satellites come into existence.
Most developing countries are still in the process of launching their own artificial satellites into space. In fact, with NigComSat-1, Nigeria remains one of the first developing countries to have launched a satellite into space, which is remarkable; but whether it can afford to maintain the proper running of this satellite, only time will tell. Soon after NigComSat-1 developed a fault, NigComSat-2 was installed. By the time other developing countries feel the need to own their own satellites, one wonders how much room will be left in orbit. Debris is mostly concentrated in the regions where more satellites are located (D. Wright, Space Debris).
Ozone layer depletion can also serve as a source of pollution in space.

Nigeria seems to be the foremost developing nation that owns a satellite in space. The advantages of satellites in space cannot be overemphasized. Even though there are some limitations to its development, space communication still remains a major benefit to our world. The world is a better place to live in today because of space communication. This concept has united the global community, bringing us closer to our families, friends, and business partners. The benefits are immeasurable compared with the challenges, which include space debris deposition.

Many questions would have been left unanswered were it not for the genius of space communication.

Even though the level achieved in satellite communication is impressive, it is still necessary to bear in mind that there is an urgent need to work on bringing the newer technologies of nanosatellites and pico-satellites into full existence, because the more countries embrace the installation of artificial satellites, the more crowding will be observed in space, which could cause frequent collisions; this we do not want to witness.
When you come to terms with the fact that there are far more developing countries than developed ones, you will understand the greater urgency of the need to manufacture mini-satellites with the same level of performance as the existing ones.

At the rate at which the world is becoming global and technology is taking over, there may come a time, very soon, when a nation owning a satellite becomes as necessary as human beings having their own personal cell phones.

Though the issue of pollution also seems to be a problem, I would recommend that an alternative source of energy be used throughout the daily running of the satellite to avoid further fuel deposition in space.

Development of a Management Information System at McDonald's

Executive Summary
Agreement on the best mix of restaurant information systems has been very hard to establish in McDonald's restaurants, despite the fact that many other businesses are already running their websites efficiently. The best model has been very hard to establish because the food industry is very delicate and cannot be handled like commercial ventures that do not deal with foodstuffs. Nonetheless, this does not mean that MIS is inappropriate for restaurants. The food industry needs a very efficient management approach that ensures that, in case of a problem, appropriate corrective measures are executed as a result of timely detection of the problem. For this to be done effectively, McDonald's will have to set in place a systematic monitoring and evaluation program. The MIS will offer a very important framework for managing restaurant operations.

To develop the MIS, the management team will establish a committee of 15 members to spearhead the project of setting up a management system. The aim of this committee is to establish what technology the company has and what current technology could be relevant to integrating the MIS into the management of the firm, and then to determine how the firm will acquire the most appropriate technology. The committee is very important in decision making. Some of the existing systems include miscellaneous monitoring formats, project-level monitoring, auditor-controller monitoring, and physical progress monitoring. Efficient implementation of the MIS can communicate the competitiveness of the business. For this reason, this research was intended to identify the current IT developments in the company, find the better-value trainings, and then implement the best restaurant MIS model for the success of the firm.

A Management Information System, denoted MIS, is a set of general controls inside a business venture or any other organization, covering the application of documents, personnel, technology, and processes by managers and accountants for problem solving, such as service or product costing, or for implementing a business-wide approach. Management information systems are very different from other, ordinary information systems, since they are applied in the analysis of the other information systems involved in the operation of the organization. Sometimes they are also referred to as information management systems (Post & Anderson 2005).

Rationally, information systems management can be used to describe a group of information management techniques connected to the automation or support of decision making, such as decision support systems, executive information systems, and expert systems (Schultheis 1999). Normal information systems are not basically designed or intended for the decision-making process. The information management systems, however, exist basically for the decision process and are identified as information technology management systems (Ibrahim et al. 2008). The IT services are basically user-focused and also differ from Enterprise Resource Planning (ERP), as ERP incorporates some aspects that do not support the decision-making process (Lederer & Salmela 2006).

McDonald's Corporation Overview
The company was established in 1940 by Mac and Dick McDonald, who decided to operate a restaurant in San Bernardino. The company obtains its revenue from rent and franchise fees, while the sales made by the restaurants operated by the corporation itself make up the larger percentage of revenue. Other sources of revenue include royalties. It is estimated that the company's revenue increased by 27% over a period of three years; it currently stands at about $22.8 billion, and operating income increased by 9% to reach $3.9 billion (Blumenthal & Haspeslagh 2004). Currently McDonald's restaurants operate in about 120 nations worldwide, and they feed close to 50 million people on a daily basis. McDonald's has about 31,000 restaurants with a workforce of about one and a half million people (Vaman 2009).

McDonald's Vision: the corporation operates on a vision of becoming the dominant company in the international food industry. Being dominant means that the standards of performance for consumer satisfaction will be exceptional, while the company at the same time aims to increase its market share and effectiveness via well-placed value and implementation tactics (Burlaud et al. 2007).

McDonald's Mission: to serve a very limited menu of tasty foodstuffs, fast, in a hygienic and welcoming restaurant, at a reasonable value, to a broad base of clients. The mission is derived from a well-thought-out strategic plan and vision. The concepts that underlie the mission are as follows: very tasty food, a narrow menu, consistent quality, value-for-money prices, fast service, outstanding consumer care, well-located restaurants, and international market coverage (Burlaud et al. 2007).

Objectives of McDonald's
These objectives are very important because they provide a focus on which several brains can work to bring out different issues and views. By setting up objectives, the corporation focuses minds on a particular course, and this guides fruitful thinking. Other than acting as guidelines, the objectives of an organization are able to grow into a strategic plan, describing where the company is headed and the resources to take it there.

Management Information System Objectives
The basic model targets the creation of more profit and is very simple to put into practice. It is also a typical website model for doing business online. McDonald's restaurants can establish the business by creating an e-business through which people can place orders, make charges on their credit cards, and purchase raw materials. This type of business model targets clients directly (B2C) and is easy to use. The Management Information System is very important in the management of a business, especially in the food industry. For this reason, McDonald's management has to adopt a restaurant MIS for successful operation of the business (Ansel & Dyer 1999).

The food industry is very dynamic: it keeps varying in terms of clientele preferences and prices, and it is also affected by recession. For this reason, there is a need to devise a method that is very easy to adjust when changes occur. The food industry, and especially the restaurant business, is very competitive. Given this type of trade, McDonald's has constantly been seeking means of improving its sales and hence increasing the profitability of the business. MIS will improve the company's efficiency.

The food industry has equipment that can be integrated with technology so that ovens and fryers run automatically according to specified times, temperatures, and quantities. Decision-making processes such as forecasting and ordering can be automated with information technology. Restaurant management information systems are specific in supporting the decision process and financial reporting in the management of restaurants (Stair & Reynolds 2007). The objectives of this restaurant MIS are to succeed in several aspects of operation, including:
Automation of the initial manual procedures.
Production of reports that enhance decision making at the managerial level.
Reduction of duplication via integration of the system.
Enhancement of communication across the business, from managers and individual restaurants to the headquarters.
Cutting down the time for food delivery.
Production of reports that assist management in tracking sales and labor and in managing the cost of food.
Production of forecasts that help management order supplies, plan food production, and schedule labor.
It is important to note that the MIS focuses on management, yet in the restaurant business profits or losses are made at the restaurant level, not at the executive level. By efficiently deploying Management Information Systems at the operational level, McDonald's might be able to expand and control its exceptional operational resources and capabilities (Morrison & Laffin 2009).

Sales: the company aims to increase the revenue obtained from direct sales at the restaurants it operates, plus other sources like royalties and rent (Burlaud et al. 2007).
Growth: the company also aims to grow and spread to new markets, as it has set the trend for globalization and venturing into the international market.
Profit: the corporation aims to increase the amount of profit it makes by ensuring that it offers its services to a wider base of customers (Clemons & Row 1999).
Customer satisfaction: the firm aims to provide exceptional services to customers, satisfying them in terms of hygiene and good-quality food.
Plan: the basic plan on which the firm operates is to offer great service to its clients and an outstanding experience in every restaurant across the globe, every time. The corporation also strives to create good relationships with its clients through activities such as children's playgrounds (Clemons & Row 1999).

Corporate Strategy

Long-Term Strategy: the strategy at McDonald's focuses mostly on the customers, and even in daily operations customer service comes first. For that reason, the customers still form the basis of the long-term strategies (Haag & Phillips 2007). The corporation understands that to remain at the top of the business it has to attend to customers in a special way. By this, the company understands that it has to perceive the needs of the customers and work towards giving them exactly what they want, in an ethical manner. So the long-term strategy is to maximize profit and customer satisfaction through the implementation of Management Information Systems and Information Technology.

Management Information Systems Strategy: bearing in mind the dynamic nature of business ventures today, there is a dire need for information to be available any time and everywhere. There are several components necessary for the effective functioning of IT in the business that the firm can choose from, and they are highlighted below.

The enterprise management system (Fombrun 2003).
Application technology: client-server, website-facilitated functions.
Security: proxy servers and firewall servers (The Financial Express 2009).
Network: intranet, internet, extranet, local area network (Vaman 2009).
Application solutions: Oracle Financials, Oracle 8i, etc.
The use of ERP, Enterprise Resource Planning (The Financial Express 2009).

Business Model: the Franchise Model. The business operates about 85% of its total capacity as rented-out franchises; the remainder (15%) the company runs directly itself as McDonald's Corporation. The company has in place a very inclusive structure for training and scrutinizing the franchisees. This makes sure that they follow the quality of service set by the company's standards, adhere to the cleanliness level, and appreciate the propositions offered by the firm to its clients (Clemons & Row 1999). MIS and IT implementation will be elaborated in these processes. It should improve communications among managers, headquarters, and restaurants, and decrease duplication of effort through system integration.

Enterprise Management System
An enterprise can be defined as a large business organization that includes all the players or stakeholders; it is also a term used for a corporate entity. Enterprises range from a small coffee house to multinational corporations like TATA. A group of people with common objectives can also fit the definition of an enterprise (Laudon & Laudon 2001). The McDonald's corporation enterprise is a multinational food dealer that retails foodstuffs directly in restaurants and also offers franchises for some of its restaurants. Since technology is spreading all over, the firm has to move with the times and make its arrangements to acquire management information systems so that it is able to manage its affairs properly (Thompson & Strickland 2008).

The ERP System. ERP is a set of software modules that support and maintain the business activities undertaken in critical back-office processes. For instance, in a manufacturing plant, the firm will need to track the sales progression, the status of inventory, invoicing, and so on (Coyle et al. 2009). There are applications that enable several functions to be performed; packages, suites, enterprise applications, and systems are connected into a single integrated system. Implementation of this underlying theory is via modules which can be integrated. The main objective of ERP is to integrate all aspects of the company and do away with complicated connections between the computer systems. The architecture is client-server and makes use of an integrated object-oriented (OO) method in designing and developing the entire system (Davenport 1994).

The major advantage of the ERP system is that it offers an integrated solution for every requirement of the business. The systems also take into account the hierarchical powers of the organization. Essentially, the ERP solutions are founded on the Windows NT and UNIX platforms (Thompson & Strickland 2008).

McDonald's will be able to integrate the following modules into its systems, as they form the major aspects of ERP: finance and accounting; marketing and distribution; personnel management; inventory and purchasing; and control and planning (Laudon & Laudon 2001). These modules will in turn offer solutions for the following functions: analysis, data capture from business transactions, validation of data transactions, updating of accounts, and report generation.

Benefits: McDonald's will gain the following benefits from the ERP implementation. Management will have an easy time reaching decisions for the entire organization, or even at the lowest level, because the information presented comes from every part of the organization, all broken down in a simple way, making it easy to understand whatever is happening at every level. It hence saves time. The system also makes it possible for the whole organization to share the same information, so the interpretation is likely to be the same (Coyle et al. 2009).

Risks: despite the purported benefits, the module also presents some risks. These include being tied to a single vendor; very limited flexibility, as there are few options; the improper imposition of generic processes; improperly forcing the organizational structure to change; and complications with regard to mapping and standardizing processes across the venture (Laudon & Laudon 2000).


Management Information System and IT at McDonald's
It is very critical for McDonald's to apply the MIS to its operations, bearing in mind that the business will keep changing and adapting to newer technologies all the time in order to remain competitive. Sophisticated means of management should be adopted so that decision making is based on accurate facts and logic, with evidence as backup. There is software that can be installed on personal computers for this purpose (Laudon & Laudon 2000). It is important for every manager to be able to look up prices and access schedules, audit inventories and company assets, and carry out forecasts, as well as monitor the organization's performance, from his or her office. Many other companies employ such tactics and allow their managers to access information on the computer on a timely basis (O'Brien 1999).

Conventionally, the food service industry has had very weak in-house systems, especially the accounting programs, as seen in the wearisome manual procedures. The outcome of such processes has always been that the companies end up desperate, with poor cost management (controls): the cost information is voluminous and out of date before the manual computations are complete (O'Brien 1999). The MIS for the food industry will get rid of such problems. The MIS provides the following modules: inventory control, general ledger, sales analysis, bank reconciliation, and recipe control.


The IT Infrastructure
IT Manager: as identified earlier, the business shall have a project manager, or rather an IT manager, who will be the overall executive officer. Other sub-sections will report to several main departments, all under an umbrella design (Haag & Phillips 2007).
The Service Management Team: the team will comprise the parties who took part in the initial research on the ITIL framework, plus other staff who will be approved by the IT department to offer neutral input. They will include the director of human resources, the director of operations, the finance director, and the marketing director. Since the business is at its initial stages, the team shall not exceed five people, as a larger team would jeopardize flexibility, momentum, and further progress. The team will have members qualified with the relevant skills to execute their job (Haag & Phillips 2007).
The Performance Teams: there will be implementation teams for each of the procedures specified in the ITIL approach, and these will include selected members of the management team and the chief personnel in each section. Management and leaders from the areas concerned will have to be included, as the decisions and changes proposed will have an impact on the workforce under their jurisdiction (Haag & Phillips 2007).

The Steering Committee: this committee will comprise the information technology managers, who will work to bring in different parties as required and to reach rational decisions (Ansel & Dyer 1999).
The Design Team: this team will provide information on critical matters that may arise; it will include some staff members from sections affected by the implementation, forming the internal reference team, while academics and general staff in regular contact with the IT section will form the external reference team (Stair & Reynolds 2007).
The Reporting Outline: the implementation team will report to the management team at definite intervals. Gathered progress information, arising concerns, and other issues will be reported on a monthly basis via the Director of IT Services. Strategic concerns will be addressed by the IT managers for strategic conclusions and policy formulation, while the daily issues that may affect performance will be addressed by the director of information technology (Haag & Phillips 2007).

Internet Presence
With the increased use of IT in the western world, there is a dire need to implement the ITIL framework in the provision of IT services in McDonald's restaurants. This is mainly because the users do not concern themselves with the infrastructure behind the services, only with the services themselves, while the business owners likewise do not bother about the maintenance of the components of the IT infrastructure. Due to this kind of negligence, there has been a series of problems in the provision of IT services, including cases of miscommunication and misunderstanding, which result in poor service provision by some restaurants (Burlaud et al. 2007). Presentation of the services on the internet is very important, and since the business is a chain, the website will have to be the same for all the branches involved; however, there will also be direction to specific locations through the country of operation. Any alterations to the normal flow or different ways of operating will be communicated.

Another major concern is that, despite the increased number of people using IT services and wishing to eat out, there has been no corresponding realignment to clients' IT needs, and only a small degree of service integration and support has been provided. If IT services were provided in a consistent, sensible and synchronized way, the business would greatly enhance the confidence of consumers who rely on IT services (Burlaud et al. 2007).

The adoption of the ITIL approach means that the services provided will be supported by ITIL, tailored to consumers' needs, and designed and developed consistently with those needs, while ensuring that the business objectives are met (Edwards et al. 2005). Professional personnel will be used to make sure that the ITIL framework principles are applied effectively and appropriately, as this would enhance and develop customer initiatives rather than functioning mainly as a support function.

From the diagram above, the business viewpoint is close to the company. The IT infrastructure, which includes deployment, operations, technical support, design and planning, is close to the technology (software, hardware and networking). Planning, implementation and service management tackle challenges in their specific jurisdictions. Service management involves service support and service delivery (incident management, problem management, service level management, change management, release management, financial management, capacity management and configuration management). Security management is a cross-cutting concern. Applications management takes care of design, operations, optimization and deployment.
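For reference, the areas named in this paragraph can be grouped as a simple data structure. The grouping below follows the paragraph's own wording and is only an illustrative summary, not a formal ITIL specification.

# Illustrative grouping of the ITIL areas named above.
ITIL_AREAS = {
    "service_support": ["incident management", "problem management",
                        "change management", "release management",
                        "configuration management"],
    "service_delivery": ["service level management", "financial management",
                         "capacity management"],
    "cross_cutting": ["security management"],
    "applications_management": ["design", "deployment", "operations",
                                "optimization"],
    "infrastructure": ["design and planning", "deployment", "operations",
                       "technical support"],
}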

Converting to the New Model
The first step is to find out what the company has and what is applicable to the market. This will enable the firm to determine the type of technology it lacks. After that, the workers' readiness is assessed. Many workers are not yet oriented to the new technology, and some will resist; training will be offered on how to use it. The components to be changed include the software, hardware, data, procedures and human resources (Burlaud et al. 2007).

The information system management requirements will ensure that the systems meet customer specifications and needs. The user management obligations, on the other hand, will ensure that customers are involved in the process and remain dedicated to the use of the new systems. The critical factors for implementation will include open communication, organizational trust, consumer/user dedication, financial support and a common view of the new strategy (Blumenthal & Haspeslagh 2004).

Socio-Technical Systems Change Model
When a complex human organization such as McDonald's decides to redesign itself, it must first choose the type of change model it is going to use for the redesign. If the organization chooses a socio-technical systems redesign methodology, it must have careful planning, widespread involvement, adequate resources, strong management support, and skillful facilitation (Pasmore 1988, p. 109). An organization that achieves these items can pursue its redesign with confidence that it can reinvent itself into a more competitive and successful organization. This model of change is suitable for McDonald's as a socio-technical organization.

The organization that uses a socio-technical systems methodology needs to realize that the process will not be a quick fix for organizational woes. The redesign can take anywhere from six months to three years to complete (Pasmore 1988, p. 110). Based upon this understanding, the organization can achieve the desired change by approaching the redesign with a realistic timetable. Organizations need to understand that the process cannot be rushed; they must invest the time needed to produce the change they want to achieve. The following is a diagram of the socio-technical systems methodology model.

This change model represents the lens used to examine McDonald's MIS and IT change model. The examination consists of looking for evidence in McDonald's IT change methodology that is consistent with the socio-technical systems methodology change model. An organization that understands the assumptions will be in a better position to use the socio-technical systems methodology to redesign itself successfully. Pasmore's redesign and implementation model does contain a weakness: its rigid linear appearance. The model appears to be a linear, systematic model rather than a systemic one. When viewing McDonald's change model through this lens, we have corrected for this heavily systematic, linear orientation by recognizing that an open system must be viewed systemically and not as a rigid step-by-step process. The following explains the socio-technical systems methodology change model that is used for McDonald's MIS implementation and change.

The Phases of MIS Change Model for McDonald's
The phases are outlined below, from scoping through evaluation; a compact sketch of the outline as a data structure follows the list.
Phase 1: Define the Scope of the MIS to be Implemented
Phase 2: Determine Environmental MIS and IT Demands
Identify Key External Stakeholders
Determine Current and Future IT Demands
Decide on Appropriate Responses to Demands
Derive Organizational Goals
Phase 3: Create the Vision Statement and Charter
Clarify Values and Outcomes, Draft Statement
Charter the Change Effort
Phase 4: Educate Company Members
MIS Education
Skill Training
Phase 5: Create the Change Structure
Create the Design Team
Intergroup Meeting Between Design Team and Steering Committee
Develop Communication Strategy
Educate the Design Team and Steering Committee
Develop Involvement Strategy
Develop Resourcing Strategy
Develop Change Strategy
Phase 6: Conduct the MIS Implementation Analysis
Identify Resources to Assist in the Analyses
Provide Training in Analytical Methods
Analyze Systems
Review With the Steering Committee
Share With the Rest of the Company
Phase 7: Formulate MIS Design Proposals
Review IT Design Inputs
Clarify Desired Outcomes
Formulate Specific Proposals
Examine Systemic Impact
Perform Cost-Benefit Analysis
Select Most Viable Proposals
Review With the Steering Committee
Phase 8: Implement Recommended Changes
Communicate and Review Proposals With the Rest of the Company
Communicate and Review Proposals With Upper Management
Create Implementation Plan
Train Employees and Supervisors
Execute Implementation Plan
Phase 9: Evaluate Changes and Redesign as Necessary
Develop Evaluation Methodology
Collect and Review Data Against Goals
Redesign if Needed
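The outline can be represented as a simple data structure for tracking progress; this is an illustrative sketch only, with phase titles taken from the list above and sub-steps omitted.

# The nine phases above, keyed by phase number (sub-steps omitted for brevity).
MIS_CHANGE_PHASES = {
    1: "Define the scope of the MIS to be implemented",
    2: "Determine environmental MIS and IT demands",
    3: "Create the vision statement and charter",
    4: "Educate company members",
    5: "Create the change structure",
    6: "Conduct the MIS implementation analysis",
    7: "Formulate MIS design proposals",
    8: "Implement recommended changes",
    9: "Evaluate changes and redesign as necessary",
}

def remaining_phases(completed: set) -> list:
    """Return the phases not yet completed, in order."""
    return [n for n in sorted(MIS_CHANGE_PHASES) if n not in completed]

print(remaining_phases({1, 2, 3}))  # -> [4, 5, 6, 7, 8, 9]

Consistent with the caution above about the model's linear appearance, nothing in this structure forces a strict sequence; phases can be revisited as the redesign proceeds systemically.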

Timeframe and Costs of Implementation
The implementation and adoption of new IT and MIS infrastructure is a very involved and complex process, especially in a global company such as McDonald's. It will require effort at all corporate levels of the company, including lower-level staff, and substantial resources will be expended in the process. It can be estimated that about 8-10% of the company's current budget should be directed to MIS implementation, and the timeframe for MIS implementation can be estimated as one year. Below are the approximate timeframes and costs of MIS implementation at McDonald's.
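To make the estimate concrete, the following sketch shows the arithmetic; the budget figure is a hypothetical assumption chosen only for illustration, not McDonald's actual budget.

# Hypothetical illustration of the 8-10% budget estimate above.
annual_budget = 100_000_000           # assumed annual budget, in dollars

low_estimate = 0.08 * annual_budget   # 8%  -> 8,000,000
high_estimate = 0.10 * annual_budget  # 10% -> 10,000,000
monthly_outlay = high_estimate / 12   # spread across the one-year timeframe

print(f"MIS implementation: {low_estimate:,.0f}-{high_estimate:,.0f} total, "
      f"up to {monthly_outlay:,.0f} per month")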

Recommendations for Improvement
For a business to thrive, it has to address the deficient areas in the market; this ensures that the venture gains competitive advantage over existing competitors. Research has indicated that most internal processes comply with the best practice requirements presented by ITIL and as such would not require much modification. However, some areas require immediate transformation in order to deliver better services and make sure that clients' expectations are met. The business will also have to address the concerns of the service providers and other players, which means the list of issues to be addressed should not be regarded as exhaustive: other concerns are likely to be identified as time goes by, and issues requiring urgent implementation may come up (Haag & Phillips 2007).

The already existing concerns include the following (Stair & Reynolds 2007):
Dependence on a single support staff member, a situation in which staff turnover and absenteeism greatly affect service delivery and reduce the level of service
Logging of only a small percentage of incidents
Limited call categorization and prioritization procedures, which reduce the meaningful management of the information that can be obtained from the database software used for configuration management
Deficient description of the services customers expect
Lack of a standardized documentation procedure, which has led to fragments of information held in several locations, some of them inaccessible to users
Insufficient identification of the internal requirements for services, leading to a decreased level of service and confusion in the department
These are just some of the issues that need to be addressed. Their prioritization will take place when the business is initiated.

The project will strive to improve the quality of information technology support for use in the restaurants, among other uses, by putting into operation the components of the ITIL approach wherever suitable and aligning them with the strategic objectives of the business to be started. The firm will be a pace setter in IT services, IT training, consultation services in the management of IT services, and information safety.

This section addresses what the project is initially intended to accomplish during implementation. Each objective will of necessity cover its own scope, aims and deliverables:
To provide quality and efficient IT services that meet the ITIL standardized requirements to the clients
To attain a unified client service culture across the entire IT operation in the McDonald's restaurants
To identify the critical components of the ITIL approach which are relevant to the businesses in the market and ensure they are enhanced and available
To prioritize the components and implement them within the preset time limit, ensuring that the quality of services is measurable
To provide the standardized best information technology practices in all areas of the information and communication technology industry
To initiate a tradition of sustained, cyclical improvement of the quality of services delivered
To help alleviate the inefficiencies in the IT departments of other branches by ensuring the effective use of existing services and resources while embracing the best changes and alternatives that may arise
To effect transformations that align with ITIL practices without any major impact on existing services or the level of service provision (Vaman 2009).

The following is also recommended to change professionals and organizational designers. To be successful in implementing technological change across a global organization such as McDonald's, they need to do the following:
1. The change initiative can move forward if, and only if, the senior executive officer sponsors the change. If the executive officer does not back the change initiative, then the change should be abandoned.
2. Training should be made mandatory to ensure that everyone in the organization understands how to interface with the new technological system.
3. When implementing the new technology, the old technology should be taken away and removed. This leaves no alternative for the employees of the organization to fall back upon.
4. A push strategy should be employed for the implementation until more than 50% of the organization has received the new technology. After a majority of the organization has received it, a decision can be made to shift to a pull strategy. This will lead the remaining portion of the organization to want the new technology in order to stay in sync with the rest of the organization.
5. MIS will support the change, while more social departments such as Marketing and Sales will be against it. This being the case, it is vital to get senior-level Marketing and Sales personnel to put their support behind the change initiative.
6. It would be better to make communication a top priority of the change methodology to explain to all the employees of the organization what is coming, why it is coming, how it is coming, and when it is coming.
7. It would be better to ensure that non-IT personnel are heavily involved in the design of the MIS change initiative. With the MIS department already supporting the initiative for its value-added benefits, involving the departments that may resist the change will contribute to the success of the design.

Summary
This paper identified key MIS points that should be redesigned and implemented in McDonald's. The MIS objectives and strategy were identified and assessed; the organizational structure and IT infrastructure, along with workflows, were identified and analyzed; and a summary of major IT processes was presented. The Enterprise Management System of McDonald's was described in detail. The MIS change and implementation plan and methodology were also specified, and reasonable timeframe and cost estimations were designed. Finally, a list of recommendations for improving the implementation process was presented.
This dissertation aims to obtain optimum levels for the variables investigated to limit visual fatigue for the process plant operators who work in interaction with Visual Display Units. This study aims to assess the effect of lighting in control rooms by examining how lighting in the visual display units is related to ocular fatigue of the operators. The study will focus on the operators who work in control rooms of process plants with an aim of using a systematic approach to address the increasing cases of fatigue in control rooms.

The results of the analysis show that lower lux levels are not good for operators, since a higher percentage reported fatigue-related feelings; very high levels are likewise associated with effects related to operator fatigue.

CHAPTER 1    INTRODUCTION
1.1    Aims and Objectives
This dissertation aims to obtain optimum levels for the variables investigated to limit visual fatigue for the process plant operators who work in interaction with Visual Display Units. This study aims to assess the effect of lighting in control rooms by examining how lighting in the visual display units is related to ocular fatigue of the operators. The study will focus on the operators who work in control rooms of process plants, with the aim of using a systematic approach to address the increasing cases of fatigue in control rooms. This aim will be complemented by the following objectives, which the dissertation aims to achieve:
Enhance knowledge and understanding about the effect of lighting on visual fatigue for Process Plant Visual Display Unit (VDU) Operators
Identify and discuss factors that interact with lighting to contribute to fatigue for operators in process plant visual display units
Identify barriers to the state-of-the-art designing of the control rooms
Increase understanding of the influence that architectural designs can have on the effect of lighting on fatigue
Gain an understanding of the current efforts made in order to improve the designing of the control rooms
Define and describe current state of the control rooms with respect to the design of the VDUs
Improve public awareness about the relationship between lighting in control rooms and fatigue for those who work in them and thereby improve or complement fatigue management and training operations
Recommend interventions designed to decrease the occurrence of fatigue resulting from lighting effects in the VDUs through proposition of fatigue management strategies
Complement existing bodies of research on the effect of lighting on ocular fatigue for Process Plant Visual Display Unit operators

1.2    Relevance of the study
The study by the Cardiff Research Program, carried out in 2007, assessed fatigue management among seafarers with the aim of developing best practice proposals for addressing problems of fatigue across ship types and trades. Measurement of fatigue was a challenge in that study because there were no standard measures for fatigue, making it difficult to draw meaningful comparisons when evaluating the results of earlier research.
Nevertheless, the research was a breakthrough in understanding the relationship between lighting and fatigue among seafarers. In follow-up work, researchers from different corners of the globe sought an in-depth understanding of the causes of fatigue in control rooms across the board. All of this has been done in an attempt to understand fatigue as a critical challenge to performance in control rooms.

Around the world, fatigue is identified as one of the major causes of accidents in plants, control rooms, and even among drivers and pilots. For instance, a research study by the Adelaide Center for Research in Australia found that about 20% of accidents occurring in plants result from operator fatigue. Health and safety regulations in Europe require that visual display units be accompanied by measures instituted to reduce the effects of fatigue due to lighting in the rooms.
In the USA, the Pipeline and Hazardous Materials Safety Administration (PHMSA) requires that all control rooms be established with control management systems that reduce the risk of operators being exposed to fatigue. Nevertheless, most of the research carried out has assessed the effect of fatigue on operator performance; few studies have endeavored to evaluate the relationship between the lighting system in control rooms and operator fatigue, even though the findings indicate a positive relationship between fatigue and the incidence of accidents.

Estimates of the prevalence of eye problems related to using visual display terminals (VDTs) differ enormously depending on the sample tested and the research methods deployed. However, a majority of scholars and authors agree that eye problems are frequent among VDT users.
This dissertation discusses the relative contributions of the nature of VDT displays, the design of control rooms, work practices and optometric aspects, with reference to the body of literature now available on this subject.

Operator fatigue is a vital safety concern that is experienced in all modes of plants with control rooms. Fatigue can instigate sleepiness and drowsiness, reduce the ability of workers to work safely, and thereby intensify the probability of fatalities and injuries.

1.3    Ergonomic design principles
Following a thorough analysis of the tasks undertaken by each operator, general principles of ergonomic design should be used to establish the safest and most proficient methods to make the best use of human performance. Operators' body postures, ocular comfort and movement must be taken into account for the proper ergonomic design of a control console to be achieved.

In addition, environmental factors such as surrounding noise and illumination levels must also be considered. A properly selected ergonomic design should aim to ease operator stress and improve alertness at all times, allowing operators to stay focused on their tasks. The following general principles should guide the selection of control consoles.

Postural variability: Given that control room operators stay seated for long periods of time, they can easily experience reduced alertness and an increased risk of fatigue due to reduced blood flow, with a resulting feeling of sleepiness. Control room consoles should therefore allow for changes of posture, for example through adjustable heights that give operators the option of standing or sitting while at the controls.

Visual comfort: Increased automation in process plants means increased interaction with computers, so the positioning of the visual display units is vital to the successful execution of the control room operators' duties. Optimal placement of display screens will therefore reduce eyestrain.

Vertical height: There are two schools of thought concerning the best vertical height for VDUs. The first argues that the monitors should be placed at eye level.

The widespread use of computer technology in the workplace has exposed workers to the hazards associated with these relatively new tools. Human interaction with computers requires input methods such as keyboards and mice, which can lead to carpal tunnel syndrome, and output via visual display units (VDUs), which can lead to visual fatigue.

Process plant VDU operators are required to interact with VDUs for long periods (as much as 12 hours continuously on a work day) and frequently complain of symptoms such as visual fatigue, headaches, nausea, musculoskeletal pain, and stress. These symptoms can lead to operator error, which has resulted in costly production losses in industry over the years.

Recognising that light is the medium by which information is transmitted from the VDU to the human eye for interpretation, this research paper will focus on the effects of lighting on visual fatigue for process plant operators. Current literature has failed to reach a consensus on a definition of visual fatigue, and researchers have relied on a combination of optometry and task analysis to conduct studies that promote their own methodology for ascertaining levels of visual fatigue.

For the purposes of this research paper, no new method for measuring visual fatigue will be developed; instead, the unique environment of the process plant control room, and the variable conditions within that particular environment, will be considered. With these variables, an experiment using process plant operators will be conducted to test the hypothesis that lighting affects visual fatigue. The variables to be considered in the factorial experiment are:
Ambient Lighting levels

This will be measured with a calibrated light meter, and direct discrete readings are expected. Lighting will be varied with rheostats (dimmer switches) so that the level can be adjusted.
Glare in the control room that affect viewing of the VDU
Glare will be measured as the luminance of the image being viewed minus the reflectance from adjacent sources; both readings will be taken with light meters.
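As a simple illustration of this difference measure, the following sketch computes a glare value from two light meter readings; the function name and example readings are assumptions for illustration only.

# Sketch of the glare measure above: image luminance minus adjacent reflectance.
def glare_index(image_luminance: float, adjacent_reflectance: float) -> float:
    """Difference measure of glare, in the meter's units (e.g. cd/m^2)."""
    return image_luminance - adjacent_reflectance

print(glare_index(120.0, 35.0))  # -> 85.0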
Gaze Angle to VDU
Gaze angle will be calculated from trigonometry using measurements of eye level above plane, VDU centre level above plane and the distance from the eye to the centre of the VDU.
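The trigonometry described here can be sketched as follows; the units and example measurements are illustrative assumptions.

import math

# Gaze angle from eye height above the plane, VDU centre height above the
# plane, and horizontal eye-to-screen distance (all in the same units).
def gaze_angle_deg(eye_height: float, vdu_centre_height: float,
                   eye_to_screen_distance: float) -> float:
    """Positive angles mean the operator looks downward at the VDU centre."""
    vertical_offset = eye_height - vdu_centre_height
    return math.degrees(math.atan2(vertical_offset, eye_to_screen_distance))

print(gaze_angle_deg(120.0, 105.0, 60.0))  # roughly 14 degrees downward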
Time on Task
Time on task will be measured by stopwatch; tasks on the VDU will be conducted at the same time to ensure the test operator maintains focus on the VDU.
Display Technology
VDUs have evolved over the years, and technology across plants varies from Cathode Ray Tube (CRT) screens to Liquid Crystal Display (LCD) screens that can be either passive or active in nature. Manufacturers' specifications will be reviewed, but no active experimentation will be conducted other than varying the actual VDU types.
Resolution
The resolution of VDUs also differs among manufacturers; again, manufacturers' specifications will be reviewed and screen resolutions will be varied.
Luminance Contrast

Adjustment of contrast will also be done on VDUs and measured using light meters providing direct data. It is expected that there will be some interaction with glare for this variable. 

Based on the results of the experimentation, optimal levels of each variable are expected to be identified. These levels may or may not match what currently exists as best practice; however, the major finding should be, with reasonable statistical confidence, that lighting affects visual fatigue.

CHAPTER 2    LITERATURE REVIEW
A substantial amount of research has been carried out in an attempt to establish the correlation between plant accidents and operator fatigue. Most of these studies point to a strong connection between operator fatigue and the incidence of accidents in control rooms. In a study carried out in the US by Lamb and Calhoun (2006) under the umbrella of Michigan's Naval Architecture and Marine Engineering Department, human fatigue was found to result from many factors, including stress at work, hunger and boredom. The researchers found that fatigue is a serious cause of decreased human performance, and, importantly, that accidents occur when fatigue has reached its maximum buildup. Although the research was specifically centered on the sleeping environment of the ship's crew, it highlighted how proper lighting in the sleeping environment helps reduce fatigue among shipboard operators.

The research carried out by Duffy and Chan (2002) sought to examine the effects of virtual lighting on operators' performance and eye fatigue. The researchers set out to determine whether different lighting conditions produce varying levels of eye performance and eye fatigue. The results of the study pointed to a significant connection between luminance levels and fatigue and eye performance.

In a more recent study, Shen, Shieh, Chao and Lee (2009) sought to underscore the significance of proper luminance in curbing visual fatigue and improving performance. The study is a breakthrough in understanding the underlying relationship between changes in illumination and eye strain or eye fatigue. Although it mainly focused on the luminance of electronic paper displays, it gives a profound understanding of how proper lighting is needed to improve efficiency. In their findings, the researchers concluded that luminance and the source of light did not have a statistically significant effect on visual fatigue, though the actual illumination had a strong positive correlation with performance; that is, efficiency improved when illumination was increased. They therefore recommended that electronic paper displays be used at an illumination of 700 lux or higher.

The study by the Cardiff Research Program, carried out in 2007, assessed fatigue management among seafarers, aiming to develop best practice proposals for addressing problems of fatigue across ship types and trades. Measurement of fatigue was a challenge because there were no standard measures for fatigue, which made it difficult to draw meaningful comparisons when evaluating the results of earlier research. Such were the gaps that previous research studies encountered.


CHAPTER 3     METHODOLOGY
This chapter presents the methodological approach adopted to achieve the objectives for the topic under study. It highlights the intended statistical approach as well as the research design and tools to be used in achieving the desired objectives. Fatigue is of vital concern and therefore forms an integral part of operator performance in control rooms and at VDUs. Nevertheless, fatigue has multi-faceted implications and therefore calls for close examination of the various issues that relate to this vital element in operators' day-to-day performance.

3.1    Restatement of the Purpose of the Study
The purpose of the study is to assess the effect of lighting in control rooms by examining how lighting at the visual display units is related to ocular fatigue of the operators, so that optimum levels can be proposed for the variables investigated to limit visual fatigue for process plant operators who work with Visual Display Units. The variables examined include difficulties in seeing, strange feelings in the eyes, a feeling of numbness, tired eyes, dizziness and headache. Since the study focuses on the effect of luminance on operators' visual fatigue, it is restricted to constructs related to eye strain and fatigue.
The research study employs a random sampling strategy, also referred to in the study as randomized selection. It employs a true experimental research design in which both the control and experimental groups are selected from the same target institution. This is done with the objective of reducing biases that result from the effects of extraneous variables (Mulaudzi 2006). The logit statistical method is applied in the study as the primary method of analyzing the effects of the lighting levels on operator fatigue.

This method is particularly well suited to studies that involve estimating probabilities of success or failure from different perspectives. Conditional probabilities can thus be established through evaluation and analysis of the different characteristics or variables under consideration (Denzin & Lincoln 2000).
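As a minimal sketch of this logit approach, the following Python example fits a logistic regression of a binary fatigue report on lux level. The data are simulated purely for illustration, assuming statsmodels is available; the coefficients used in the simulation are assumptions, not findings of this study.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
lux = rng.choice([175.0, 300.0, 500.0], size=120)  # assumed lux conditions

# Simulate binary fatigue reports from an assumed true logit relationship.
true_logit = 2.0 - 0.005 * lux
fatigue = rng.binomial(1, 1.0 / (1.0 + np.exp(-true_logit)))

X = sm.add_constant(lux)                  # intercept + lux predictor
model = sm.Logit(fatigue, X).fit(disp=0)
print(model.params)                       # estimated intercept and lux coefficient
print(model.predict(X[:5]))               # conditional probabilities of reporting fatigue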

Several characteristics of this research study make it fit an experimental design. First, it uses systematic manipulation to establish the relationship between the different levels of luminance and the various variables that relate to operator fatigue. Various variables are manipulated to determine their effects on others or their impact on the subject under investigation. For instance, the lux levels are manipulated through simulation to determine the resulting effect on the operators' feelings of numbness, tired eyes and strange feelings in the eyes, among the other specific constructs identified as relating to fatigue.

Another characteristic that makes this research design fit its category is that the researcher used controlled testing, for instance by selecting samples from the same institution. This is done to establish the real relationship between luminance and fatigue, and to control for biases due to extraneous variables.

Lastly, and of course as a fundamental point, the main prerequisite for an experimental design is met when the researcher uses random assignment in the sampling procedure. Random assignment gives the researcher an opportunity to provide each participant in the target group an equal chance of participation or representation.

With respect to the strengths and weaknesses of this type of design, an experimental design has a strong capacity for determining cause-effect relationships among the variables being manipulated and controlled. On a related note, because manipulation and control involve selecting the variables before they are subjected to experimental manipulation, control and observation, if the right variables are not identified the entire study is rendered little better than a waste of research time, as it will not make its intended contribution (Bryman & Bell 2007). The experimental design used in this research study will therefore help the researcher to assess the following:

The effect of lighting on visual fatigue for Process Plant Visual Display Unit (VDU) Operators
Factors that interact with lighting to contribute to fatigue for operators in process plant visual display units
The influences that architectural designs can have on the effect of lighting on fatigue

3.2    Validity
3.2.1    Internal Validity
The internal validity of the research is low, as with most field research; a number of factors are likely to affect the responses of the respondents. First, reactivity effects, or Hawthorne effects, may affect internal validity when respondents react not to the procedures of the study but simply to being studied. For instance, in the Hawthorne study, the researchers found that although they had created different environments for the employees, worker productivity just kept increasing. They concluded that the workers were responding not to the experimental conditions that had been created but to their awareness of being studied.

Another notable threat to internal validity is selection bias. Since participation in the study is voluntary, selection bias is likely to affect internal validity: Wood and Haber (2009) note that most studies in which target respondents decide for themselves whether to participate face the impact of selection bias. Instrumentation is also identified as a factor affecting internal validity; any alterations in the measurement of variables or changes in the techniques of observation may produce changes in the measurements ultimately obtained. A good way of dealing with this threat is to ensure consistency of the instruments and techniques used; as such, the lux levels must be maintained at exactly the desired level for all participants.

Another threat that may considerably affect the internal validity of this research study is hypothesis guessing. This threat arises where respondents base their responses and behavior on what they perceive the study to be about, responding as a reaction to the study rather than to the survey itself. The researcher will minimize the impact of this threat by clarifying the concepts of the study to the respondents before the actual study commences.

A history threat may also affect the internal validity of the study. Given that the study examines the effects of luminance on ocular fatigue in a large organization across various levels and variables, historical events in the recent past may influence the outcome. This threat to internal validity typically arises when such interfering events occur between the pre-test and the post-test.

Since the researcher intends to make a prior request to the administrators in the organization under study and to the relevant authorities, another threat that may come into play is the testing threat, where participants carry out their own research, which may affect the actual study by acting as a pre-test. A mortality threat may also hamper internal validity where respondents drop out of the study, leading to an inflated measure of the revealed effects.

These threats will be addressed through randomized selection, through careful and systematic consideration and elimination of alternative causes of particular responses as far as possible, and by taking plausibility into consideration.

3.2.2    External Validity
External validity relates to the extent to which the results of the study can be generalized to, and across, particular populations, settings and times, for instance where there is true random sampling of respondents with random assignment. The salient threats to external validity of this study relate to the extent of generalization that can be drawn from it, since the results may not hold across all groups despite the use of true random sampling; to rule out effects of extraneous variables, randomized selection is used. First, whether the results of the study hold for all times is a threat arising from the interaction of history and treatment. Secondly, the interaction of selection and treatment raises the uncertainty of whether the results stand across all groups. Lastly, the interaction of setting and treatment leaves it uncertain whether the results hold across all settings.

In order to reduce the impact of threats to external validity, the researcher has identified the target population for the study. In addition, the researcher will ensure that the samples drawn are representative.

3.3    Data Analysis
3.3.1    Statistical Data Analysis
Questionnaire items that receive no response will be coded as missing values. Scale values will be calculated as the mean of the individual items. All items are assumed to be approximately normally distributed.
Descriptive statistics and Analysis of Variance (ANOVA) will be performed to evaluate whether there is a significant relationship between fatigue-related factors and changes in lux levels (Ayelet, Lingard & Levinson 2008). The relationships between the various variables will be measured with Pearson product-moment correlation coefficients (Grinnell & Unrau 2007).
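A minimal sketch of this analysis in Python follows, assuming NumPy and SciPy are available. The fatigue ratings are simulated purely for illustration (loosely echoing the sample sizes reported in the results); they are not study data.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Simulated fatigue ratings (1-7 scale) for the three lux conditions.
scores_175 = rng.normal(5.1, 2.0, 16)
scores_300 = rng.normal(4.8, 2.0, 24)
scores_500 = rng.normal(4.6, 2.0, 16)

# One-way ANOVA: do mean fatigue ratings differ across lux levels?
f_stat, p_anova = stats.f_oneway(scores_175, scores_300, scores_500)
print(f"F = {f_stat:.2f}, p = {p_anova:.3f}")

# Pearson product-moment correlation between lux level and fatigue rating.
lux = np.concatenate([[175.0] * 16, [300.0] * 24, [500.0] * 16])
ratings = np.concatenate([scores_175, scores_300, scores_500])
r, p_corr = stats.pearsonr(lux, ratings)
print(f"r = {r:.2f}, p = {p_corr:.3f}")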

A run of frequencies for the responses is carried out to determine the variability and centrality of the responses, enabling a conclusive report to be drawn from the results. Standard deviations will be used to measure variability in the operators' responses concerning the various constructs under investigation; high standard deviations will be interpreted to imply high variability in the reports.

The Statistical Package for the Social Sciences (SPSS, Version 16.0) is used for all the statistical analyses carried out.

RESULTS
The following are the results of the analysis of the experiment that was carried out. The researcher analyzed the data with respect to frequencies and also carried out an analysis of variance (ANOVA). The ANOVA offered a succinct comparison of the various means recorded in the experiment at the various lux levels; as such, it helped to rule out the possibility that the differences in means arose from any factor other than chance (Stevenson 2008).

In order to enable both intra- and inter-variable comparison, the analysis also included variable splitting, where the results were split between the start and the end of the session and then by lux level. In addition, the ANOVA was particularly helpful in testing whether the operators had the same averages at different lux levels for the different variables assessed in the study. Although this was not actively proposed as one of the hypotheses, testing it enabled the researcher to rule out differences in sample characteristics as the cause of any difference in means arising from the analysis.

The results of the analysis had various implications. The analysis for the lux level at the start of the testing period produced a lux mean of 321.43, with a standard deviation of 125.357 and a variance of 15714.286. At the same point, a run of frequencies for the various variables indicated that most of the operators exhibited a dizzy feeling, with a mean of 4.86 and a standard deviation of 2.0222, followed by those who experienced headache at the same lux level, with a mean of 4.64 and a standard deviation of 1.833. The mean for those who felt numb at these aggregated lux levels was 4.11, with a standard deviation of 1.637.

However, analysis of the percentiles at different lux levels showed that when the operators were exposed to a lux level of 500, the number who reported a dizzy feeling increased compared with a lux level of 300. Similarly, a lux level of 300 produced a greater dizzy feeling among the operators than a level of 175. Nonetheless, the operators on average had no distinctive feelings at lower lux levels, but as the lux level was gradually increased, the change this caused in variables such as dizzy feeling, headache and numbness can be seen. The results of the analysis can be seen in the table above.

The following table, however, presents a comprehensive analysis of the variables at each level of lux. This analysis differs slightly from the one above in that the different lux levels formed the split-output definers, so the analysis is sorted by each lux level rather than the aggregated level. At a lux level of 175, the mean for difficulties in seeing was 1.94; the total number of participants at this level was 16. The mean for strange feelings in the eyes was 3.19, clearly greater than the mean for difficulties in seeing but less than the means for feeling numb (4.19), dizzy (5.12) and headache (4.50).

At a lux level of 300, the mean for difficulties in seeing falls from 1.94 to 1.62, showing that at 300 lux the operators experienced less difficulty in seeing than at 175 lux. It is worth noting that the standard deviations at the various levels differ: at 300 lux the standard deviation for difficulties in seeing was 0.824, while it was 1.237 at 175 lux. This results from the different values of N (the proportions of the sample who participated at those levels differed). At 175 lux the sample size was 16, while at 300 lux it was 24, reducing the level of variance.

When the lux level was increased to 500 for the same variable, the mean for difficulties in seeing is again 1.94, the same value recorded at 175 lux. The difference in standard deviations, however, sheds more light on the gravity of the difficulties in seeing: at 500 lux, with a standard deviation of 0.824, the level of difficulty could be higher than at 175 lux even though the sample sizes were the same. The deduction is that there is less standard error in the results at 500 lux than at 175 lux, even with equal sample sizes and equal means.

With respect to strange feelings in the eyes, the mean at 175 lux is 3.19, with a standard deviation of 1.13. As the lux level is increased to 300, the mean moves slightly upward to 3.62 and the standard deviation also increases, to 1.74. A further increase of the lux level to 500 leads to a reduced mean of 3.44, with a standard deviation of 1.031.

On the other hand, the mean for tired eyes at 175 lux was 2.38, with a standard deviation of 1.31. When the lux level is increased to 300, the mean increases to 3.17, but the standard deviation also increases sharply, to 2.180. This indicates that even though the mean for tired eyes increased, there was also wide variation in the individual operators' reports; the individual measures deviated strongly from the mean, indicating a high level of standard error.

With respect to feeling numb, a lux level of 300 produced the lowest average score, and even the standard deviation at this level was lower than at the other levels. Thus, while at 300 lux those who felt numb had an average of 3.92 and a standard deviation of 1.558, the other lux levels had means of 4.19 and 4.31 and standard deviations of 1.870 and 1.580 for 175 and 500 lux respectively. It is therefore evident that at 300 lux the operators experienced the least feeling of numbness.

The analysis also generated results for dizzy feeling and headache among the operators at the different lux levels. At 175 lux, the mean for dizziness was 5.12 and the standard deviation was 1.996. The mean fell to 4.83 at 300 lux, while the standard deviation at that level increased over the one recorded at 175 lux, to 2.057. When the lux level was increased to 500, the mean fell further to 4.63 and the standard deviation increased slightly to 2.094. Thus, the reduction in the mean is offset by the increase in standard deviation, which denotes greater variability in individual feelings among the operators. With respect to headache, a mean of 4.50 with a standard deviation of 1.414 was recorded at 175 lux. The mean increases by 0.29 points when the lux level is increased to 300, and the measure of variability increases to 1.841. It is therefore deducible that, with respect to headache at the start of the experiment session, lower lux levels produced lower levels of headache.

The analyses of the variables were also done at the end of the operator shift; the table below offers the comprehensive results in summarized form. At the end of the session, the mean for difficulties in seeing at 175 lux is 4.75, with a standard deviation of 1.291, a considerable increase over the start of the session, when the mean was 1.94 and the standard deviation 1.237. The standard deviation shows no significant change, meaning that the individual scores were clustered around the mean; that is, most operators exhibited feelings of visual strain. At 300 lux the mean at the end of the session is 3.29, with a standard deviation of 1.301; this is lower than that recorded at 175 lux and almost double the level recorded at the start of the session. At 500 lux, the mean again increases over that recorded at 300 lux, and is higher than that recorded at the beginning of the session.

Looking at strange feelings in the eyes at both the start and the end of the session at the various lux levels, there is a clear increase in the means when the end-of-session results are compared with those recorded at the beginning. At 175 lux the mean is 5.38, with a standard deviation of 2.125, but these figures fall considerably when the lux level is increased to 300; the standard deviation falls to 1.204 from the 1.740 observed at the beginning of the session. Thus at 300 lux the operators' reports of strange feelings in the eyes cluster around the mean, implying that most operators have a reduced strange feeling compared with that observed at 175 lux. As the lux level is increased to 500, there is only a marginal change in the mean, while the standard deviation falls, indicating less variability in the operators' reports of strange feelings in the eyes.

The mean for tired eyes remains high and equal at lux levels of 175 and 500, at 5.81 in both cases. However, at 175 lux there is high variability, as seen from the high standard deviation. At 300 lux, the mean for tired eyes is not only lower, but the variability is also lower than at the other levels.

The trend for numbness runs in much the same direction as that recorded for tired eyes: lux levels of 175 and 500 record high means for numbness, with high variability observed at 175, while 300 lux again records the least feeling of numbness among the operators. The observations for dizziness and headache follow the same trend, with lower means recorded at 300 lux and higher means at the other lux levels.

Based on the results of the analysis, there is evidence that at a lux level of 300 there were fewer reports of fatigue-related constructs such as eye strain, headache, dizzy feeling, numbness and difficulties in seeing. By contrast, comparing the end-of-session results with those recorded at the beginning of the session at the various lux levels shows a clear increase in the means for constructs such as strange feelings in the eyes.

Ocular fatigue is characterized by difficulties in seeing, a dizzy feeling, numbness and headache. It is therefore evident that optimum lux levels are imperative to ensure that the operators do not experience these characteristics, which indicate the presence of ocular fatigue. At lower lux levels the eyes strain, and the operators become fatigued, which hampers their ability to operate with the desired proficiency. Where an operator feels dizzy, falling asleep may easily lead to accidents.