Telecommunication

Synchronous Transmission
Synchronous transmission of data does not use start and stop bits; instead, the sender and the receiver share a common clock. Blocks of several characters are transmitted as a unit called a frame, with header bits inserted at the beginning of the block and trailer bits at the end.

Asynchronous Transmission
Asynchronous transmission of data uses both start and stop bits and typically provides speeds of around 38.4 kbps. Data is sent character by character: each character is framed by a start bit and a stop bit, and an extra bit, known as the parity bit, is usually added as a simple check that the character arrived without error.
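As an illustration of this framing, here is a minimal sketch (added for clarity, not part of the source) that computes an even-parity bit for a 7-bit ASCII character and frames it with a start bit and a stop bit.

# Sketch: even-parity framing of one ASCII character for asynchronous transmission.

def even_parity_bit(char):
    """Return the bit that makes the total number of 1 bits even."""
    ones = bin(ord(char)).count("1")
    return ones % 2  # 1 if the count of ones is odd, 0 if it is already even

def frame_character(char):
    """Start bit (0) + 7 data bits + parity bit + stop bit (1)."""
    data_bits = format(ord(char), "07b")
    return "0" + data_bits + str(even_parity_bit(char)) + "1"

for c in "Hi":
    print(c, frame_character(c))   # e.g. H 0100100001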

Analog Signal
An analog signal is a continuous signal in the form of a wave. It suffers less attenuation than a digital signal and can be multiplexed to increase the effective bandwidth of a link.

The disadvantage of an analog signal is that it is more prone to errors from noise and interference, and it is usually slower than a digital signal.

Digital Signal
A digital signal is a series of on and off electronic pulses that is discrete in time. This type of signal is normally faster than an analog signal but suffers greater attenuation.

XON and XOFF are control characters, taken from a particular character code, that are used for flow control in data transmission. XOFF is normally sent by the receiving device to tell the transmitting device to stop sending, and XON tells it to resume.
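The sketch below (illustrative only, not from the source) shows the standard ASCII codes of these two control characters and how a receiving device might decide which one to send; the buffer thresholds are arbitrary.

# Sketch: XON/XOFF software flow control decision on the receiving side.

XON = b"\x11"   # ASCII DC1: receiver asks the sender to resume
XOFF = b"\x13"  # ASCII DC3: receiver asks the sender to pause

def flow_control_signal(buffer_fill):
    """Return the control byte to send for a buffer fill level of 0.0-1.0."""
    if buffer_fill > 0.9:
        return XOFF          # buffer nearly full: stop the sender
    if buffer_fill < 0.2:
        return XON           # buffer drained: let the sender resume
    return b""               # no flow-control action needed

print(flow_control_signal(0.95))  # b'\x13' (XOFF)
print(flow_control_signal(0.10))  # b'\x11' (XON)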

Simplex Transmission
Simplex transmission of data is transmission in which data travels in one direction only; it is strictly unidirectional. It is used in radio broadcasting, where there is a radio transmitter at one end and radio receivers at the other.

Duplex Transmission
Duplex transmission is divided into half duplex and full duplex. In half duplex, data is sent in both directions but in only one direction at a time, as in a walkie-talkie. In full duplex, data is sent in both directions simultaneously, as in a telephone conversation.

Serial Transmission
Serial transmission of data involves a string of bits transmitted one after the other, in sequence, over a single transmission channel.

Parallel Transmission
In parallel transmission of data, every bit of a character is transmitted on its own channel, so all the bits of a given character are sent simultaneously.


Baseband
Baseband is a transmission mode that carries only one high-speed signal at a time. The digital signal uses the complete bandwidth of the cable, which constitutes a single channel; there is no frequency division, so the signal travels only over short distances.

Broadband
Broadband transmission mode accommodates a number of separate frequency bands. This system uses analog signalling, with unidirectional transmission over a range of frequencies. In this mode, data signals travel longer distances before being attenuated (Bates, 2002).

Serial Line Internet Protocol (SLIP)
This is a protocol used for communication between two pieces of telecommunication equipment that have previously been configured to communicate with each other over a modem or serial-port link. SLIP does not offer error detection, so on its own it is not suitable for error-prone dial-up connections.

Point-to-Point Protocol (PPP)
This is a data link protocol used to set up a direct link between two telecommunication nodes over synchronous or asynchronous lines. PPP can provide compression, encryption of transmissions, and connection authentication over physical networks such as phone lines, mobile telephone links, and fiber-optic connections.

Hypertext Transfer Protocol (HTTP)
This is a protocol that provides a standard way for web browsers and servers to request and transmit files on the World Wide Web. HTTP is also used for downloading files, and web pages are normally transmitted without any encryption (Blank, 2004).
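As a minimal illustration (added here, not from the source), the sketch below issues a plain, unencrypted HTTP GET request using Python's standard http.client module; the host example.com is a placeholder.

# Sketch: one unencrypted HTTP GET request and response.

import http.client

conn = http.client.HTTPConnection("example.com", 80, timeout=10)
conn.request("GET", "/")                    # ask the server for the root page
response = conn.getresponse()
print(response.status, response.reason)     # e.g. 200 OK
body = response.read()                      # page contents, sent in the clear
conn.close()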

File Transfer Protocol (FTP)
This is a protocol for exchanging files over a TCP/IP network; it is used for transmitting multiple files, transfers data efficiently and reliably, and promotes file sharing.
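Here is a minimal sketch (added here, not from the source) of downloading one file with Python's standard ftplib module; the host, credentials, and filename are placeholders.

# Sketch: retrieving a single file over FTP.

from ftplib import FTP

ftp = FTP("ftp.example.com")
ftp.login("user", "password")                            # anonymous login is also common
with open("report.txt", "wb") as local_file:
    ftp.retrbinary("RETR report.txt", local_file.write)  # download the file
ftp.quit()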

Transmission Control Protocol (TCP)
This is a connection-oriented protocol of the Internet protocol suite that provides reliable transport delivery between computers as information travels over the Internet. TCP controls the rate at which data is transmitted and manages traffic congestion, and it underlies applications such as file transfer and web access.
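As an illustration (added here, not from the source), the sketch below opens a TCP connection, sends a few bytes, and reads the reply; the host and the echo service on port 7 are assumptions made for the example.

# Sketch: a reliable, ordered byte stream over a TCP connection.

import socket

with socket.create_connection(("example.com", 7), timeout=10) as sock:
    sock.sendall(b"hello over TCP")   # TCP delivers these bytes reliably and in order
    print(sock.recv(1024))            # reply echoed back by the (assumed) echo service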

Internet Protocol (IP)
This is the protocol used for data communication from one computer to another over the Internet. Each host has a unique IP address that distinguishes it from every other host on the Internet.
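A minimal sketch (added here, not from the source) using Python's standard ipaddress module shows a host address and the network it belongs to; the addresses are placeholders.

# Sketch: a unique host address and its surrounding network.

import ipaddress

host = ipaddress.ip_address("192.168.1.25")        # one host's address
network = ipaddress.ip_network("192.168.1.0/24")   # the network it sits in
print(host in network)                             # True
print(network.num_addresses)                       # 256 addresses in this /24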

Oceanography

Compare and contrast the various layers of ocean waters, flora, and fauna.
Ocean waters have been divided by scientists into five main layers, also known as zones. These layers start at the ocean surface and extend down to the deepest regions, where light cannot reach. The creatures that survive in the extreme depths of the sea are fascinating and bizarre, and as we descend deeper into the ocean the pressure increases and the temperature drops rapidly. The epipelagic zone is the first layer of the ocean and extends about 200 meters (656 feet) down from the surface. It is also called the sunlight zone because it receives most of the visible light; the light also provides heat, which is responsible for the range of temperatures within this zone.

The next layer, referred to as the mesopelagic zone, extends from 200 meters down to 1,000 meters. It is also known as the midwater or twilight zone. Only extremely faint light can penetrate to this depth, and bioluminescent creatures with twinkling lights are visible here. A wide range of strange and bizarre fishes is found in this region, such as the malacosteid family, also called the loosejaws. The third layer is called the bathypelagic, midnight, or dark zone. It extends down to 4,000 meters, and the only visible light in this zone is produced by the creatures that live there. The water pressure here can reach 5,820 pounds per square inch (Thurman and Trujillo, 2007). Despite the extreme pressure, creatures such as sperm whales dive into this region to look for food. The animals that survive here are either red or black because of the lack of light.
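As a rough, back-of-the-envelope check added here (not part of the source), the hydrostatic relation pressure = density x gravity x depth, with a typical seawater density, gives a value at 4,000 meters close to the figure quoted above.

# Sketch: hydrostatic pressure at the bottom of the bathypelagic zone.

rho = 1025        # kg/m^3, approximate density of seawater (assumed)
g = 9.8           # m/s^2, gravitational acceleration
depth = 4000      # m, lower limit of the bathypelagic zone

pressure_pa = rho * g * depth            # pascals
pressure_psi = pressure_pa / 6894.76     # 1 psi = 6894.76 Pa
print(f"{pressure_pa:.2e} Pa = {pressure_psi:.0f} psi")   # about 4.0e+07 Pa, roughly 5,800 psi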

The fourth layer extends down to 6,000 meters and is called the abyssopelagic or abyssal zone; the name comes from a Greek word meaning "no bottom." No light is present and the water temperature is close to freezing. Few creatures survive in this region, and those that do are invertebrates such as small squids and basket stars. This zone covers about three quarters of the ocean floor, and the deepest-dwelling fish recorded in it was found in the Puerto Rico Trench at a depth of 8,372 meters. The hadalpelagic zone is the final layer, extending to the deepest parts of the sea, including canyons and deep-water trenches. The Mariana Trench is the deepest point, at 10,911 meters, with a temperature close to freezing and a pressure of about eight tons per square inch. Despite these forbidding conditions, creatures such as tube worms and starfish survive here.

Discuss coastal processes such as shoreline erosion.
Coastal erosion is the wearing down of the shore of a body of water made up of gravel, sand, or larger rock fragments, by either sudden or gradual action. It is a natural process: woody debris and eroded sediments remain in shallow coastal waters and can allow the shoreline to evolve naturally. However, coastal erosion also has negative impacts, including increased segmentation of the shoreline. Shoreline erosion is driven by changes in climate, relative sea level, tides, the frequency of tropical storms, and sediments delivered from the deep sea. These factors cause either long-term chronic erosion or short-term, storm-induced erosion (Thurman and Trujillo, 2007).

Analyze the daily fluctuations of tides and know their importance to tidal communities
Winds and tides that occur daily have a minimal effect on the shoreline, but tropical storms and hurricanes have a severe impact, driving huge quantities of shoreface sand along or away from the coast. Hydraulic action occurs when a wave suddenly compresses air trapped in a joint in the rock, causing the rock to crack. Wave pounding occurs when waves strike the shoreline with great force. Abrasion, also known as corrasion (not to be confused with corrosion), occurs when waves drag their sediment loads along the sea cliff; it is one of the fastest and most effective forms of shoreline erosion, and limestone cliffs are severely damaged by it. Attrition occurs when particles carried by the waves collide with one another and break down, becoming lighter and easier to wash away; the resulting material is eventually deposited as sand or shingle.

Tides are the daily fluctuations of sea level that result from the earth's rotation and the gravitational forces of the sun and the moon. The shape of the sea bottom near the shore also influences the size of the tides. Two high tides and two low tides occur daily in most coastal regions. Tidal forces exert force and energy on different parts of the earth, producing relative movement of material in the solid earth, the atmosphere, and the ocean. In the ocean, tidal forces generate alternating tidal currents that gradually displace the sea surface (Thurman and Trujillo, 2007).

The ocean and the surrounding environment have significant effects on communities living in coastal regions. Positive impacts for tidal communities include tourism and recreation opportunities, which improve the economies of these regions; fishing, hunting, boating, crabbing, and skiing are some of the recreational activities these communities enjoy. Tides are also important to fishing and shipping companies. Navigators must schedule docking for high tide, when the water is deepest, especially when heading for shallow ports. Tidal height also matters because many harbors and rivers have a shallow bar at the entrance that can prevent boats with a large draft from entering at certain tidal heights.

Compare and contrast deep and superficial ocean currents
Superficial currents, or surface circulation, consist of about ten percent of all the water in the ocean and occupy roughly the top 400 meters. Deep-water currents, or the thermohaline circulation, account for the remaining ninety percent. The movement of these deep waters through the ocean basins is driven by density and gravity, and the differences in density arise from differences in temperature and salinity. Deep water sinks to the bottom of the ocean basins, where temperatures near freezing increase its density. Another distinction is that surface currents are driven primarily by wind, while deep-sea currents are driven by the density of the deep water.

Discuss the causes and effects of waves
Wind is the major cause of waves on the ocean surface; wind energy is transferred to the water through friction between air molecules and water molecules. Powerful winds, such as those in storms, cause large waves. The motion of a wave is not a horizontal, straight flow of water. Tsunamis, sometimes called tidal waves, are not the same as surface waves; they are caused mainly by volcanic eruptions, underwater earthquakes, or landslides. Waves can cause coastal flooding, especially when there is a storm surge, and strong waves such as tsunamis can have devastating effects, including destruction of property and loss of life. A positive effect of waves is that their energy can be converted to electricity (Thurman and Trujillo, 2007).

Understand the origins of the oceans
The origin of the oceans can be traced back to the formation of the earth about 4.6 billion years ago, when the planet was built up by the accumulation of small bodies called planetesimals. The water in the ocean is believed to have come from the condensation of vapor outgassed from the earth's interior, and some of it was delivered by colliding comets. Some researchers argue that a major portion of oceanic water was supplied by the heavy bombardment that occurred in the early solar system billions of years ago.

Ocean origins are often discussed in terms of the deuterium-to-hydrogen (D/H) ratios of different water sources in the solar system. The major source of ocean water is believed to be carbonaceous chondrites (CCs), or a particular mixture of solar nebula material and comets. Because the D/H ratio of CCs is close to that of the earth's oceans, the CC origin of ocean water is widely accepted by scholars.

Discuss the history of oceanography and know key events of the study.
Oceanography can be defined as the study of the world beneath the ocean and the air above it. It has been recognized as a formal scientific discipline for only about a hundred and fifty years, which makes it one of the newer fields of science, although its roots go back thousands of years to the time when people first ventured out from the coastlines on rafts. The first explorers, navigators, and oceanographers paid attention to the ocean in different ways, observing the storms, waves, tides, and currents that carried their rafts in different directions. Around 2,850 years ago, early philosophers and naturalists began trying to make sense of the body of water they could see from the land (Gregory, 1999). Because the ocean appeared endless when observed from the shoreline, people believed that the entire world was flat. In the late 1400s and early 1500s, sea explorers, Columbus among them, concluded that the world is in fact not flat but round, a near-sphere whose surface is largely covered by oceans.

More precisely, modern oceanography began as a scientific field in the 19th century, roughly 130 years ago, with the launch of British, European, and American expeditions to explore ocean currents, the seafloor, and ocean life far from their coastlines. The Challenger Expedition of 1872 to 1876 was the first scientific expedition to explore the world's seafloor and oceans.

Compare and contrast oceanic plates and identify the margins.
There are normally three kinds of tectonic plate boundaries: convergent boundaries, divergent boundaries, and transform boundaries. As the giant plates move, converging or diverging along their borders, tremendous energies are unleashed, producing tremors and transforming the surface of the earth.

Divergent boundaries
Here, new crust is created as two or more plates pull away from each other. As a result, oceans form and grow wider as the plates pull apart and diverge. When this occurs on land, a separation or rift appears, and over a long period of time the land mass breaks into separate pieces and water fills the space between them.

Convergent Boundaries
Here, crust is destroyed and recycled back into the earth's interior as one plate slides beneath another, in what are called subduction zones; volcanoes and mountains often form where plates converge. Convergent boundaries are of three types: oceanic-oceanic, oceanic-continental, and continental-continental. Oceanic-continental convergence occurs when an oceanic plate is pushed against, and subducted beneath, a continental plate; the overriding continental plate is lifted and a mountain range is created. At depth, the subducting plate breaks into smaller pieces that remain locked in place for long periods before large earthquakes are generated. In oceanic-oceanic convergence, a deep oceanic trench is formed where one oceanic plate is subducted beneath the other; undersea volcanoes also result from this process (Gregory, 1999).

Continental-continental convergence occurs when two continents meet head-on; neither is subducted, because continental rocks are light and, like colliding icebergs, resist downward motion. Instead, the crust buckles and is pushed sideways or upward.
Transform-Fault Boundaries
This is where two plates slide horizontally past one another. Most transform faults are found on the ocean floor. They offset active spreading ridges, producing zig-zag plate margins, and they are generally marked by shallow earthquakes (Gregory, 1999).

The role of the oceans in the earth's climate
In the past, the earth was treated as if it were effectively infinite in size, with an unlimited capacity to absorb pollutant emissions into its oceans and atmosphere. In recent years, however, there has been much discussion of desert expansion and global warming, which may result from environmental destruction on a global scale. Anomalies like these are caused by upsets in the heat balance arising from interactions between the atmosphere, the ocean, and other parts of the earth system. The conditions in which people live are largely shaped by interactions between the ocean and the atmosphere, among other factors.

The system of the ocean
The heat capacity of the ocean is about 1,000 times that of the atmosphere. The ocean transports heat through its currents, supplies energy to the atmosphere, dissolves substances and drives chemical reactions, and supports living creatures; it therefore plays a dominant role in stabilizing the global environment and maintaining life on earth. Although the atmosphere is present everywhere on earth, little atmospheric data is collected over the ocean, which makes meteorological analysis of ocean-related phenomena difficult. The ocean's role as a source of heat and water is considered predominant (Thurman and Trujillo, 2007).

Ice and Snow System
Ice and snow normally cover large areas of the earth's surface and therefore have a high albedo. They have a large effect on the flow of heat and water, and they consequently influence the earth's climate and meteorology.

Alternative to Cell phone Towers

Since its introduction, the cell phone has become an integral part of our lives. Although it has enhanced communication and enabled us to stay connected, the cell phone network infrastructure (cell phone towers) transmits information by radio frequencies, and chronic exposure to such frequencies may have hazardous effects on our health, especially over long periods. Various studies have reported dangers and health problems, particularly among people who live near these towers. A study by Dr. Bruce Hocking found that children living near phone towers had double the risk of childhood leukemia compared with those who lived at least seven miles away. Eye cancers, miscarriages, and cardiac problems are other illnesses that have been associated with living near cell phone towers (Zhang and Hu, 2007). The concerns associated with cell phone towers have generated complaints from various stakeholders and created the need for an alternative that allows cell phone use while mitigating these complaints. This paper evaluates the Distributed Antenna System (DAS) as an alternative technology to cell phone towers.

A Distributed Antenna System (DAS) is an alternative technology that is visually much less intrusive to the neighbouring community. A DAS is a network of spatially separated antenna nodes connected to a common source by a transport medium, providing wireless service within a geographical area. Its architecture consists of small groups of low-powered antennas mounted on top of existing utility poles throughout the coverage area. It is a true alternative to cell phone towers in that it gives cell phone service providers the ability to enhance their networks while adhering to communities' aesthetic requirements (Zhang and Hu, 2007).

DAS technology has enabled cities to reduce the proliferation of traditional cell phone towers: instead of erecting additional sky-climber towers and rooftop antenna sites, cell phone providers can use small, low-profile DAS equipment strategically placed on existing utility poles, streetlight poles, and other discreet locations in the public right-of-way. DAS is a viable alternative with many advantages over the cell phone tower; for instance, because most antennas are mounted on top of standard utility poles, they can be located away from public centers such as schools and churches, protecting the neighbouring community from any danger associated with radio frequencies (Zhang and Hu, 2007).

DAS technology is a win-win alternative: there is no need for cellular towers, which are perceived as a health risk by the neighbouring community. Mobile phone service providers can migrate to this technology, which uses small, unobtrusive antennas placed on existing utility poles away from public areas. DAS antennas are connected by fibre to the base station, improving cell phone wireless coverage while minimizing future cell phone tower construction through advanced fibre-optic technology and effective support for public safety services.

LAN/WAN SECURITY OF DATABASES IN A CLOUD COMPUTING ENVIRONMENT

Chapter 3 Research Methodology

3.1 Overview
This chapter presents the research methodology and the methods the study used to collect data. Data were collected through both primary and secondary research in order to investigate LAN/WAN security of databases in a cloud computing environment. Primary data were collected through a quantitative method, while secondary data were collected through critical analysis of journal articles, research articles, and white papers related to LAN/WAN database security in a cloud computing environment. To enhance the validity and reliability of the research, the data collected were subjected to critical analysis in order to strengthen the authenticity of the study.

The methodology for this field of research combined a survey method with secondary research.

3.1.1 Methodology pertaining to the field of research
The methodology of this study is aimed at investigating LAN/WAN security of databases in a cloud computing environment. In most research studies, data are collected through qualitative methods, quantitative methods, or mixed methods. However, where secondary sources are scarce or primary data are not available, a study may employ experimental research, for example through simulation, to obtain data. Choosing an appropriate research method for a field of study ultimately depends on the research aims and objectives (Eramus, 2009).

In this study, the research aims and objectives were used to choose an appropriate method of data collection.

3.1.2 Two main goals in presenting the methodology
There are several goals to be achieved by the research methodology for LAN/WAN database security in a cloud computing environment. One goal is to collect data that can be used to enhance the security of LAN and WAN databases in a cloud computing environment. In the contemporary business environment, database security has become extremely important and is now one of the highest priorities of business organizations; an organization could lose IT assets worth several million dollars if adequate security is not implemented for its databases.

Apart from the security goal for WAN and LAN databases in a cloud computing environment, the other goal of presenting the research methodology is to fill the gap created by the paucity of data on LAN/WAN database security in a cloud computing environment. As discussed in the previous chapter, there is a scarcity of academic sources for secondary research. Moreover, because cloud computing is a recent information systems technique, many organisations have no experience of implementing databases in the cloud. Thus, the goal of the methodology employed is to enhance research knowledge on LAN/WAN database security in a cloud computing environment (Cleveland, 2009).

To choose an appropriate research method for LAN/WAN database security in a cloud environment, the researcher critically examined all the methods of data collection and made a systematic decision about the research approaches available for this field of study before choosing one.

3.1.3 Systematic decisions about the methods to be deployed
The method deployed for data collection is very important, and the decision to choose an appropriate method depends on the problem to be addressed in the research, the aims and objectives of the research, and the research questions to be answered. Foster (1998) pointed out that a researcher needs to consider a research strategy that fits the research questions.

Before deciding which method to adopt for data collection, the paper compares and contrasts the two main methods. The qualitative and quantitative methods have been the major methods of data collection in many academic studies, and comparing them informed the decision on the method deployed in this paper.

3.1.4 Compare and contrast the quantitative and qualitative methods
Qualitative and quantitative methods employ different techniques to collect and analyse data. While the quantitative method collects data through surveys, the qualitative method employs unstructured interviews. The benefits and shortcomings of both approaches guided the choice of data collection method for this study. Key (1997) argued that the qualitative technique provides in-depth and comprehensive information for a research study, which helps provide a deeper understanding of the research undertaking. Ismail and Zin (2009) also pointed out that the qualitative method was an effective way to study LAN/WAN network security devices in a cloud environment.

Despite the advantages of the qualitative method for this study, it is difficult to establish the validity and reliability of a qualitative study (Maxwell, 1996; Wholey, Hatry and Newcomer, 2004).
On the other hand, the quantitative method has the advantage of producing more accurate results, because personal bias can be avoided with this technique (Hope University, 2010).

The accuracy of data collected through the quantitative technique can be assessed with statistical techniques, aided by software packages and computational algorithms (University of Wales, 2010; Patton, 2002).

The shortcoming of the quantitative method is that the research often takes place in an unnatural setting, so it may be difficult to apply the results in the real world.

However, Network Instruments (2005) argued that both quantitative and qualitative techniques could be employed to analyze the efficacy of security devices in a WAN and LAN environment. The white paper presented by Network Instruments showed that the efficacy of a security device known as a network analyzer could be reviewed using both quantitative and qualitative techniques. The network analyzer was an effective security device in a WAN and LAN cloud computing environment because it alerts the security administrator when other security devices have failed (Motorola, 2010).
With these arguments presented, the study decided to implement the quantitative method to collect data. To reach a final decision on the appropriate method, the choice of the quantitative technique was presented to the committee to ensure that a sound methodology was chosen within the field of research.

3.1.5 Allowing the committee to ensure a sound methodology within the field of research
To enhance the integrity of research, university policy mandates that a researcher present the proposed methodology to a committee so that its scientific validity can be assessed. Using a methodology that is not in accordance with the research aims and objectives wastes both the human effort and the money a researcher has put into the research. To avoid such mistakes, the committee systematically reviews the different research methodologies put before it and grants ethical approval before the research proceeds (Senate, 2009).

Thus, to ensure ethical selection of an appropriate method, the methodologies reviewed were presented to the committee, and after its consideration the quantitative method was selected as the appropriate method of data collection for the study.
Logical organization of the methodology is critical for producing logical findings; the next section provides the logical organization of the methodology.

3.2 Logical Organization of Methodology
The logical arrangement of the research methodology is very important, and it enhances the validity of the research; the logical structure of a study provides the evidence needed to answer the research questions.
In the logical organization of the methodology for this study, identification of the sample population comes first, because the data will be collected from that sample population (Mama, 2006).

Next, the questionnaires were designed, and all of them were structured to achieve the research aims and objectives. Identifying the medium for distributing the questionnaires to the sample population is also very important. Since the study decided that the whole sample population should receive the questionnaires, email was used to distribute them; the primary advantages of distributing questionnaires by email are that it is fast and cost-effective.

Since the study employed different types of research for data collection, the types of research conducted in this paper are presented in the next section.

3.2.1 Type of research and specific subtype.
The paper employed two types of research for data collection: primary research and secondary research. The primary research involved collecting data through a quantitative technique, using a survey. The secondary research involved reviewing scholarly journal articles, research articles, white papers on computer security, and academic books. A primary advantage of secondary research is that data can be accessed easily through electronic databases; Castle (2003) identified a key advantage of secondary research as its potential for resource savings and cost-effectiveness (p. 3).

The context of, and access to, the primary and secondary data are essential, and they are presented in the next section.

3.2.2 Context and access.
The context of, and access to, the primary and secondary research are critical to this study. The context of the secondary research is the research field investigated, which revolves around LAN/WAN database security in a cloud computing environment. The secondary data had to cover the features and characteristics of LANs and WANs, as well as the essential features of databases in a LAN and WAN environment. The integration of LAN/WAN databases into a cloud computing environment is a new phenomenon in computer information systems, and so are the security devices that protect such databases. Thus, the context of the secondary research covers database concepts, LAN/WAN concepts, and security devices for LAN/WAN databases in a cloud computing environment. The secondary data were accessed through electronic databases, the online library, and the university library. The electronic databases used were Science Direct, the ACM Digital Library, the Institute of Electrical and Electronics Engineers (IEEE) Xplore Digital Library, and Wiley InterScience. One advantage of electronic databases is that they contain large collections of scholarly journal articles, research articles, and reports on information technology, including many computer- and IT-related research articles (Pelmer, 2010).

By using electronic databases, the study was able to access a large volume of secondary data. To retrieve relevant data, the researcher employed a series of keywords: Local Area Network, Wide Area Network, Cloud Environment, Database security, and Database security in a LAN/WAN cloud computing environment. Using these terms to search the electronic databases ensured that the data retrieved were relevant to the study.

The context of the primary data consists of the list of questions used to collect data from the respondents. Ten questions were sent to the respondents, who were asked to select an appropriate answer for each; the questionnaire sent to the respondents is presented in Appendix 1.

The participants for the primary research were selected by a systematic method. The next section identifies the participant population and the procedure by which participants were selected.

3.2.3 Participant population and selection procedure
The participants in the primary research were IT security experts, database administrators, system designers, system engineers, and other IT experts. The participant population was also drawn from top executives of medium and large organizations whose companies had experience with LAN/WAN databases in a cloud environment, as well as executives of several IT firms. To ensure that all the important participants took part in the survey, the researcher employed probability sampling to select participants. According to Trochim (2006), probability sampling involves random selection from the sample population, so every member of the population has an equal chance of being chosen. The researcher employed stratified random sampling, one of the major methods of probability sampling, which is effective because every member of the sample population has an equal chance of being selected (University of Nebraska, 2009; Geogian Southern University, 2009).

The researcher distributed 200 questionnaires to the sample population, with the aim of receiving at least 50 completed questionnaires that were error free and free of bias. The researcher divided the population into ten groups and distributed 20 questionnaires to each group; with this system, every member of the target population had an equal chance of being selected.
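As an illustration of the stratified sampling just described, the sketch below (added here, not from the original study) divides a hypothetical population into ten groups and randomly selects 20 recipients from each, for 200 questionnaires in total; the group labels and population records are placeholders.

# Sketch: stratified random sampling of questionnaire recipients.

import random

def stratified_sample(population, stratum_key, per_stratum=20, seed=42):
    """Randomly pick `per_stratum` members from each stratum."""
    random.seed(seed)
    strata = {}
    for person in population:
        strata.setdefault(person[stratum_key], []).append(person)
    sample = []
    for group, members in strata.items():
        k = min(per_stratum, len(members))   # guard against small strata
        sample.extend(random.sample(members, k))
    return sample

# Illustrative population: ten groups of IT roles, 40 people per group.
groups = [f"group_{i}" for i in range(1, 11)]
population = [{"id": n, "group": groups[n % 10]} for n in range(400)]

recipients = stratified_sample(population, stratum_key="group", per_stratum=20)
print(len(recipients))   # 200 questionnaires distributed, 20 per group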

To ensure that the questionnaires were properly filled in and well understood by the participants, the researcher employed an appropriate instrument for data collection.

3.2.4 Instrument
The survey instrument is essential in primary research because it is the tool used for data collection. Ten questions were sent to the participants, and a specific criterion was applied in designing them: the researcher used structured, closed questions, so a participant could only tick one of the answers supplied. An example of the questions sent to the participants is as follows:

Survey question: A firewall is an appropriate security device for a LAN/WAN database in a cloud environment.
Strongly disagree [ ]    Disagree [ ]    Agree [ ]    Strongly agree [ ]
All the questionnaires sent to the participants are found in Appendix 1.
Data collection is very critical in research methodology, because it underpins the research findings.

3.2.5 Data collection
All the data were collected through email. As pointed out in the previous section, the questionnaires were distributed by email to ensure that all participants received the questions and that the data were collected on time. The researcher began collecting data seven days after distributing the questionnaires, and most of the completed questionnaires were returned by email. Of the 200 questionnaires sent to the participants, the researcher received 155 back. To ensure that all 155 returned questionnaires were reliable and valid for generating research findings, the data were subjected to data analysis.

3.2.6 Data analysis
Data analysis is very critical in research. To enhance research validity, it is essential to analyze both the primary and the secondary data in order to confirm their accuracy.

3.2.6.1 Analysis of primary data
Of all the questionnaires distributed, the participants returned 155. Analysis of these 155 questionnaires revealed that 32 participants did not fill in their questionnaires properly, 23 respondents returned their questionnaires blank, and 28 respondents left theirs blank with a note saying they did not understand the contents. Further analysis revealed suspected bias in 21 questionnaires. According to Trochim and Donnelly (2007), it is essential to remove all suspected bias from the data in order to enhance data reliability. A summary of the correctly completed questionnaires is provided in Table 1 and Fig. 1. The SPSS statistical software was used for the data analysis to enhance data reliability.

Table 1 Summary of the analysis of the primary data

Total questionnaires distributed                                                   200
Total questionnaires returned by participants                                      155
Questionnaires not properly filled in by participants                               32
Questionnaires returned blank                                                       23
Questionnaires returned blank because participants lacked IT security knowledge     28
Questionnaires with suspected bias                                                  21
Questionnaires properly returned and properly filled in                             51
Fig. 1 Graphical summary of the analysis of the primary data
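The small sketch below (an illustration added here, not the SPSS analysis used in the study) tallies the questionnaire outcomes quoted in Table 1 and computes the response rate; the category labels are shorthand introduced for this example.

# Sketch: tallying the returned questionnaires into the Table 1 categories.

from collections import Counter

outcomes = Counter({
    "properly_filled": 51,
    "not_properly_filled": 32,
    "returned_blank": 23,
    "blank_no_it_knowledge": 28,
    "suspected_bias": 21,
})

returned = sum(outcomes.values())
distributed = 200

print(f"Distributed: {distributed}")
print(f"Returned: {returned}")                          # 155
print(f"Response rate: {returned / distributed:.1%}")   # 77.5%
print(f"Usable for analysis: {outcomes['properly_filled']}")  # 51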


3.2.6.2 Analysis of secondary data
There were several strategies the researcher employed to analyze the secondary data.
First, the researcher used unique keywords to search the electronic databases: database, cloud computing, WAN, LAN, and security devices for LAN/WAN databases in a cloud computing environment. The essential reason for using these unique keywords was to retrieve data relevant to the study, and all the secondary data reviewed were directly relevant to it.

3.3 Summary
This chapter has presented the research methodology, covering the method of data collection and the identification of the sample population. The paper showed that both primary and secondary research were employed for data collection, with the quantitative method used to collect the primary data. To enhance data validity and reliability, the data were subjected to analysis in order to generate appropriate research findings.

The main points of the article

ARTICLE 1

Title of the Article: Are Aquariums Getting Too Lifelike?
Author of the Article: Todd Heisler
Date Published: March 22, 2010

The main points of the article. Why is this issue/discovery/technology important?

This article is about how the harvesting of certain marine creatures, collected because of their usefulness to marine aquarium hobbyists, could upset the balance of nature. The article focuses on the authorized harvesting of certain species of crabs and invertebrates that frequent coral reefs and are bought by hobbyists because of their ability to clean artificial fish tanks and maintain balance within the mini-ecosystem of an aquarium. Some scientists argue that harvesting these creatures could eventually lead to their extinction. This is refuted by experienced harvesters, who claim that the species they collect can be sustained because they reproduce rapidly enough to repopulate even after heavy harvesting. Scientists also argue that research on these marine creatures should precede harvesting, because of the thousands of species harvested every year only a few have been studied in depth. The article concludes that, regardless of the reasons, overfishing should be given serious thought, because over-harvesting could mean that some of these species are no longer around for future generations to enjoy.

An opinion on the scientific merit of the article. (good science, useful science, pseudoscience)

The importance of this article lies in the fact that it is environmentally relevant. It describes what could happen to the marine ecosystem as a result of the issue being discussed, and it reiterates the need for people to ensure that the environment is not abused or overused, because an upset in the balance of the ecosystem could have serious consequences. This is serious, useful science when considered in the context of humanity's existence: because humanity is part of the circle that includes all living things, anything that happens to any creature in that circle could have devastating effects on us.

What you learned that you did not know before.

The fact that hobbyists collect various species of shrimp, crabs, and invertebrates for their tanks in order to balance the mini-ecosystem is something new to me. I had always thought these creatures were kept only for their aesthetic and visual appeal, but it turns out that they serve a purpose beyond what I initially assumed. The issue of licensed harvesting of these creatures is also new to me, because I had assumed such harvesting was prohibited by law.

How the article relates to topics that have been or will be covered in class.

This article fits under the topics of biology, and more specifically marine biology, because it deals with the wildlife found in the sea and its role in the balance of the marine ecosystem. It could also be included in lessons on zoology, since it refers to species of creatures that naturally live under water.

ARTICLE 2

Title of the Article: Water Found on Moon, Researchers Say (NASA, via Reuters)
Author of the Article: Kenneth Chang
Date Published: November 13, 2009

The main points of the article. Why is this issue/discovery/technology important?

This article is about how a space mission known as the LCROSS mission, which sent a two-part spacecraft crashing into the crater Cabeus on the surface of the moon, apparently detected about 26 gallons of water. The impactor had a crashing segment and a second segment designed to measure the debris plume. According to the scientists who observed the crash, the impact threw up a plume containing molecules that indicated the presence of water. This discovery is important to the future of the human race because of plans to inhabit the moon should the earth become uninhabitable, and the presence of water on the moon is vital to future plans to build space infrastructure there. The discovery might also merit further manned moon missions. Besides water being a source of sustenance for anyone who eventually lives on the moon, scientists also noted that water can be broken down into its component molecules for use as rocket fuel. The article also explains that, despite this seemingly breakthrough discovery, the amount of water found is not enough to provide conclusive evidence of abundant water on the moon.

An opinion on the scientific merit of the article. (good science, useful science, pseudoscience)

This article is very important in that it offers brand-new insights into something that was previously unknown. It can be used by other researchers as a starting point for future research on the moon. It presents very useful science: in the history of space exploration the moon has played a very important role, and the discovery of water on its surface, however minimal, opens new avenues of exploration and study. In itself, this article has the capacity to jump-start innovative studies into the question of lunar habitability.

What you learned that you did not know before.

Naturally, as this is a new discovery, I did not previously know that there was water on the moon. Although earlier readings had assumed the presence of water on the moon, no study had confirmed the theory before the mission described in this article. I also learned that there are ongoing studies of the lunar surface and that scientists are seriously considering the possibility of people one day living on the moon.

How the article relates to topics that have been or will be covered in class.

This article fits under the topics of astronomy and the solar system, as it concerns the earth's natural satellite, the moon. It can also be discussed under scientific breakthroughs in space exploration, because it focuses not only on the moon itself but also, more importantly, on the moon in the context of its role in the existence and sustainability of our own planet, the earth.

BHARTI AIRTEL INTRODUCES THE IPHONE 3GS IN THE INDIAN MARKET

According to Clendenin, the Indian mobile industry is expected to triple in size within a few years, becoming one of the fastest growing markets in the world. iSuppli Corporation, one of the leading technology research and advisory companies, based in California, revealed that the number of subscribers practically doubled to 149.5 million, up from 85 million in 2005. It states that about 5.5 million Indians subscribe every month, which should raise subscriber numbers to 484 million by 2011.

Jagdish Rebello, a manager and chief analyst for iSuppli, says that the arrival of low-cost phones, tariff declines, increases in per-capita earnings, industry- and consumer-friendly regulations authorized by the government, along with a host of other factors, have been crucial in driving this growth (Clendenin).

According to a PR Log press release, with the biggest cellular service provider in India, Bharti Airtel, finally launching the much anticipated iPhone 3GS last month, this achievement is seen by many people in India as a great step forward that will enhance the mobile industry far beyond most earlier estimates. The phone is available in two versions, 16 GB and 32 GB, and in black and white. The iPhone 3GS has long battery life, video recording, and a 3 MP camera with auto-focus, and many analysts identify it as the fastest and most powerful phone Apple has launched in the Indian market.
Since the majority of phones in India are simple, voice-centric devices, the iPhone 3GS is expected to replace many of them thanks to its strong consumer features; as the phone grows in popularity, the replacement rate is expected to reach 25 percent by 2011 (Clendenin).

Investigate the domino effect with a set of dominoes

The aim of this experiment is to investigate how the speed of the domino effect depends on the distance between the dominoes.

I think that the speed of the domino effect decreases as the distance between the dominoes increases. It is also assumed that the speed decreases linearly with increasing distance; in other words, the time taken for the dominoes to fall completely depends linearly on the distance between them.

I therefore expect the graph of the time taken for the dominoes to fall completely against the distance between them to be a straight line, showing a direct proportionality, as sketched in Fig. 1.

Fig. 1 Expected relation between the distance between the dominoes and the time taken for them to fall completely.
The variables involved and studied in the current work are:
Independent variable: the distance between the dominoes (inches)
Dependent variable: the time taken for the domino chain to fall completely (seconds), which measures the speed of the domino effect
Constants: the number of dominoes and the material of the dominoes
Materials List
Dominoes (46 black plastic dominoes), a stopwatch, a ruler, and a large, flat, clear surface

Diagram
Fig. 2 below shows the dominoes standing and falling.

Fig. 2 Dominoes at rest (left) and dominoes falling (right).
Procedure
I set up all 46 dominoes on a large, flat, clear surface, standing upright in a straight line and separated by  inch each.

When all the dominoes were set up, I held the stopwatch and started it as I tipped the first domino.

When the last domino fell, I stopped the stopwatch and noted the time taken precisely.
I repeated the above three steps three times.

Then I changed the distance between the dominoes to 1 inch and repeated the above four steps. I did the same for a distance of 1.5 inches and collected the data shown in Table 1.

Table 1 Observed times taken for the dominoes to fall, for various distances between the dominoes

Distance (inches)    Time (sec), Trial 1    Time (sec), Trial 2    Time (sec), Trial 3    Average (seconds)
                     1.1                    1.1                    1.1                    1.1
1                    2.3                    2.3                    2.3                    2.3
1.5                  3.5                    3.5                    3.5                    3.5

Results and discussions
The graph of the collected data, shown in Fig. 3, exhibits a linear relationship between the distance between the dominoes and the time taken for them to fall completely, which shows that the speed of the domino effect decreases as the distance between the dominoes increases.
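As an illustration added here (not part of the original report), the sketch below fits a straight line, time = slope x distance + intercept, to the 1 inch and 1.5 inch rows of Table 1; the fitted numbers in the comments follow from those two averages only.

# Sketch: least-squares straight-line fit to the averaged times in Table 1.

distances = [1.0, 1.5]     # inches (from Table 1)
avg_times = [2.3, 3.5]     # seconds (from Table 1)

n = len(distances)
mean_x = sum(distances) / n
mean_y = sum(avg_times) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(distances, avg_times)) / \
        sum((x - mean_x) ** 2 for x in distances)
intercept = mean_y - slope * mean_x

print(f"time = {slope:.2f} * distance + {intercept:.2f}")   # time = 2.40 * distance + -0.10
# A positive slope means the chain takes longer to fall as the spacing grows,
# i.e. the speed of the domino effect decreases with increasing distance.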