Nature’s Fury: The Global Effects of Natural Disasters

By Kriss Gross

March 9, 2014

Long lines at gas stations and store shelves emptied of essentials like water, non-perishables, batteries, ice, candles, and generators are sure signs of an impending hurricane for anyone living on the United States’ (US) eastern and southern coasts. Thanks to advances in monitoring and forecast modeling, hurricanes are one of the few weather phenomena people can actually prepare for. Mother Nature does not always allow such lengthy preparations, however, especially in the case of tornadoes, where people may have only a few minutes to get to a safe place. Earthquakes give no warning at all and often lead to tsunamis, especially quakes that occur beneath the ocean.

Monitoring the Phenomena

The World Meteorological Organization (WMO) website shares information on impending storms based on advisories released by Regional Specialized Meteorological Centers (RSMCs) and Tropical Cyclone Warning Centers (TCWCs), and on official warnings issued by National Meteorological and Hydrological Services (NMHSs) for their specific countries and areas of responsibility. Media outlets compile the advisory and warning data when planning the news bulletins they release to viewers in the areas of concern (WMO, 2014). These weather centers share maps and tracking data received from satellites, unmanned aerial vehicles (UAVs), manned weather research aircraft, and coastal and fixed ocean data buoys.

The technological advances of modern communication give people a fighting chance to survive many of today’s natural disasters. Satellites and some UAVs, like the National Aeronautics and Space Administration’s (NASA) Global Hawk, carry a Hurricane Imaging Radiometer (HIRAD), developed by Georgia Tech Research Institute (GTRI) engineers, that tracks and records data from storms. This information feeds continuing research into more sensitive tracking instruments, which afford more accurate details about developing storm systems (Wallace & Toon, n.d.).

The WMO website allows users worldwide to see developing weather advisories and warnings. Another international site, http://weather.org/stormwatch.htm, lets users type in the area they wish to observe. The sidebar lets users choose from a variety of phenomena, from hurricanes, tornadoes, snow, and tides to earthquakes, tsunamis, floods, and fires. Weather.org also has a Farmer’s Almanac and a newly added aurora forecast. The advantage of interactive websites like this is that users can check the weather in any area they plan to travel to, or anywhere friends or family are being affected by bad weather.

US residents can obtain information from www.weather.org, the National Weather Service at www.nws.noaa.gov/, www.wunderground.com, and many locally supervised sites, such as those connected with local television and radio. In Europe, websites like www.weatherpal.se/, www.meteoalarm.eu/, and www.wunderground.com/severe/europe.asp allow users to click on their country and region to view weather in any area of interest or concern. Animated icons show different alerts, like flooding from snowmelt, high winds, and rising sea levels. The websites also offer the option to view the information in various languages.

Japan, Taiwan, and Mexico have earthquake early warning systems, and the US has various seismological networks; while not necessarily focused on early warning, the Advanced National Seismic System includes approximately 100 seismic monitoring stations. The Global Seismographic Network (GSN) is a fixed, digital network of seismological and geophysical sensors connected by telecommunications networks. A partnership among the US Geological Survey, the National Science Foundation, and the Incorporated Research Institutions for Seismology, the GSN allows for worldwide monitoring of the Earth, using as many as 150 modern seismic stations distributed globally (Bogue, 2012).

Geography

Hurricanes’ destructive forces affect landmasses along the Atlantic and eastern Pacific oceans, the Caribbean Sea, and the Gulf of Mexico. With winds that can exceed 155 miles per hour (MPH), hurricanes are capable of causing cataclysmic damage to coastlines and areas several hundred miles inland. Tornadoes and microbursts commonly coincide with these horrific storms, bringing further destruction in the form of flying debris and heavy rainfall, which often leads to flash flooding and land- or mudslides in areas well away from the hurricane itself. Organizations like the Federal Emergency Management Agency (FEMA) and the Department of Homeland Security (DHS) work together to educate citizens about the dire effects of these storms, not just in the US but in neighboring oceanic communities as well.

Tornadoes can be just as devastating, as the one that hit Joplin, MO demonstrates. In the late afternoon of May 22, 2011, an EF5 multiple-vortex tornado struck Joplin. Reaching a maximum width of over one mile, with winds peaking at 250 MPH, the tornado destroyed or damaged virtually everything in a six-mile path.

The devastating tornado claimed 161 lives, making it the deadliest single U.S. twister since 1953. The Joplin tornado was only the second EF5 tornado to strike Missouri since 1950. It was the seventh-deadliest tornado in U.S. history and the 27th-deadliest in world history (Missouri Storm Aware, 2014a).

Resources

To meet NOAA’s “commitment to create a Weather-Ready Nation,” in which the US is capable of preparing for and responding to situations that affect “the safety, health, environment, economy, and homeland security,” NOAA’s Office of Weather and Air Quality funded seven multi-year proposals with $1.3 million in 2013, enabling scientists and partnering universities to swiftly and efficiently “transfer new technology, research results, and observational advances through NOAA’s Joint Hurricane Testbed (JHT) to operational hurricane forecasting.” John Cortinas, director of NOAA’s Office of Weather and Air Quality, which manages the U.S. Weather Research Program that funds JHT projects, stated, “These important projects will help improve the information and tools that NOAA forecasters and researchers use to forecast tropical cyclones that impact the U.S. population and economy” (Allen, 2013).

Political Impact

Administrations and policy makers have the arduous task of determining the when, where, and how much of recovery, relief, and eventually rebuilding efforts after devastating storms have torn communities apart. In the US, Hurricane Sandy struck as the 2012 presidential election was drawing near, leaving the candidates to decide between continuing to campaign and attending to their communities. President Obama put campaigning aside only long enough to tour the battered coastal areas, declare states of emergency, and authorize the release of disaster relief funds. The American Red Cross, FEMA, and other organizations made their way to the affected areas to set up aid and relief sites. On November 6, 2012, “Residents in some of the affected areas are allowed to vote in the presidential election via email or fax, and some states allow voters to vote at any polling station” (CNN, 2013).

Economic Impact

The ultimate damage from these storms is loss of life; compounding matters, damage from the most recent storms, like those in the past, has left people without homes, without power to the homes that survived, with businesses ruined (resulting in joblessness), and with vehicles damaged or destroyed. Community infrastructure also feels the impact as businesses decide whether to rebuild, relocate, or both.

Superstorm Sandy

In large urban areas like New York City, public transportation was reduced due to flooding. A 2013 CNN report shared that New York’s Metropolitan Transportation Authority (MTA) estimates over “$5 billion in losses: $4.75 billion in infrastructure damage and a further $246 million in lost revenue and increased operating costs.” “According to a report released by the National Hurricane Center, Sandy is expected to rank as the second-costliest tropical cyclone on record, after Hurricane Katrina of 2005, and will probably be the sixth-costliest cyclone when adjusting for inflation, population and wealth normalization factors” (CNN, 2013). Arguments ensued, with a very public New York City Mayor Michael Bloomberg saying, “Superstorm Sandy cost the city and local businesses some $20 billion,” and Governor Andrew Cuomo stating in an NPR interview, “The taxpayers of New York cannot shoulder this burden, and I don’t think it’s fair to ask them to shoulder this burden. This state and this region of the country have always been there to support other regions of the country when they needed help. Well, we need help today.” Cuomo made a point about Congress’ allocation of billions of dollars to aid Florida and other Gulf Coast states after hurricanes like Katrina and Andrew. Commentator Joel Rose also noted that New Jersey’s Governor Chris Christie put his state’s storm damages at an estimated $29 billion (Rose, 2012).
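
As a quick arithmetic check on those MTA figures, the two quoted components do add up to the headline number. A minimal sketch in Python, using only the dollar amounts from the CNN quote above:

```python
# The two loss components quoted from CNN (2013):
infrastructure_damage = 4.75e9   # infrastructure damage, in dollars
lost_revenue_and_ops = 246e6     # lost revenue and increased operating costs

total = infrastructure_damage + lost_revenue_and_ops
print(f"MTA estimated losses: ${total / 1e9:.2f} billion")
# -> MTA estimated losses: $5.00 billion (just a hair under $5B before rounding,
#    consistent with the "over $5 billion" headline once other costs are added)
```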

Katrina

University of North Texas Professor Bernard Weinstein put the total economic loss from Katrina (August 2005) as high as $250 billion, a figure that considers the economic impact of disrupted gas production as well as the direct damage from the storm. Between them, Katrina and a second hurricane, Rita (September 2005), affected 19% of U.S. oil production, destroying 113 offshore oil and gas platforms, damaging 457 oil and gas pipelines, and spilling nearly as much oil as the Exxon Valdez (1989) disaster. This caused oil prices to increase by $3 a barrel and gas prices to approach $5 a gallon. To stop the escalation in gas prices, the U.S. government released oil from its stockpile in the Strategic Petroleum Reserve. The storm also decimated Louisiana’s sugar industry, with the American Sugar Cane League estimating $500 million in lost annual crop value. This area of Louisiana is also home to 50 chemical plants, responsible for 25% of the nation’s chemical production, along with 12 of Mississippi’s coastal casinos, which account for $1.3 billion annually (Amadeo, 2012).

Preparedness

Being prepared for an impending natural disaster can mean the difference between life and death; while technology can help predict storms like hurricanes, some phenomena give no warning at all. Tornadoes, sometimes spawned by hurricanes, give little or no leeway, leaving only minutes to get to a safe place. Earthquakes happen with no warning, although some are preceded by smaller tremors. Born from oceanic earthquakes, tsunamis add insult to injury by developing quickly after the earth stops shaking.

Because of lessons learned from previous disasters, hurricane-prone regions build structures on stilts, composed of materials that can withstand high winds and survive potential flooding. Evacuation routes are in place along coastal areas, and the available lead time lets residents secure homes and businesses and, in the case of mandatory evacuations, depart the area. Websites like www.ready.gov list preparedness guides, giving users the information and guidance needed to prepare (Ready.gov, 2014).

In what the US calls “Tornado Alley,” many residents have built “safe rooms.” These rooms are generally in a basement, in the centermost part of the ground floor, or on a concrete floor in the garage. Many Midwest homes use storm cellars (or fruit cellars) that were built decades ago (personal knowledge). Residents have learned to have a pre-established communication plan and emergency kit (Ready.gov, 2014). Although tornado sirens are commonplace in regions like “Tornado Alley,” they often cannot be heard inside homes or businesses. Because of this, NOAA recommends that all residents, especially those in tornado-prone areas, have a NOAA Weather Radio All Hazards (NWR) receiver.

“NWR numbers 1000 transmitters, covering all 50 states, adjacent coastal waters, Puerto Rico, the U.S. Virgin Islands, and the U.S. Pacific Territories. NWR requires a special radio receiver or scanner capable of picking up the signal.” NWR broadcasts warnings and post-event information for all types of hazards: weather (e.g., tornadoes, floods), natural (e.g., earthquakes, forest fires, and volcanic activity), technological (e.g., chemical releases, oil spills, nuclear power plant emergencies), and national emergencies (e.g., terrorist attacks). “Working with other Federal agencies and the Federal Communications Commission’s (FCC) Emergency Alert System (EAS), NWR is an all-hazards radio network, making it the most comprehensive weather and emergency information available to the public” (NOAA, 2014).

Regions prone to earthquakes have modified the way structures are built, with buildings able to shift with the earth’s movement, thereby minimizing damage. Residents preparing their homes for possible earthquakes can follow guidelines at sites like www.ready.gov to make their homes safer. Unfortunately, in low-lying areas, like those in many regions of Asia, population density leaves many residents living in communities that sit directly on fault lines, putting them in a direct path for disaster. For those who survive the initial earthquake, moving to higher ground to escape an ensuing tsunami may be their only means of survival (Ready.gov, 2014).

References

Allen, M. (2014) Research behind the high-resolution rapid refresh weather forecast model. National Oceanic and Atmospheric Administration (NOAA). Retrieved from http://research.noaa.gov/News/NewsArchive/LatestNews/TabId/684/ArtMID/1768/ArticleID/10458/NOAA%E2%80%99s-Newest-Weather-Model-Provides-Clearer-Faster-Forecast-of-Severe-Weather.aspx

Allen, M. (2013). NOAA invests $1.3 million with university and federal researchers for hurricane forecasting advances. National Oceanic and Atmospheric Administration (NOAA). Retrieved from http://research.noaa.gov/News/NewsArchive/LatestNews/TabId/684/ArtMID/1768/ArticleID/10253/NOAA-invests-13-million-with-university-and-federal-researchers-for-hurricane-forecasting-advances.aspx

Amadeo, K. (2012). How Much Did Hurricane Katrina Damage the U.S. Economy? About.com/US Economy.  Retrieved from http://useconomy.about.com/od/grossdomesticproduct/f/katrina_damage.htm

Bogue, R. (2012).  Monitoring and predicting natural hazards in the environment. Sensor Review, 32(1), pp. 4-11. Retrieved from http://search.proquest.com.libproxy.edmc.edu/docview/916982912

CNN. (2013). Hurricane Sandy fast facts. CNN Library. Retrieved from http://www.cnn.com/2013/07/13/world/americas/hurricane-sandy-fast-facts/

Federal Emergency Management Agency (FEMA). (2014). Hurricane Sandy impact analysis. Retrieved from http://fema.maps.arcgis.com/home/webmap/viewer.html?webmap=307dd522499d4a44a33d7296a5da5ea0

Folger, T. (2012). Tsunami science. National Geographic. Retrieved from http://ngm.nationalgeographic.com/2012/02/tsunami/folger-text

Knabb, R., Rhome, J., & Brown, D. (2006). Tropical Cyclone Report-Hurricane Katrina. National Hurricane Center. Retrieved from http://www.nhc.noaa.gov/pdf/TCR-AL122005_Katrina.pdf

Missouri Storm Aware. (2014a). What is a tornado? Tornado Facts and History. Retrieved from http://stormaware.mo.gov/tornado-facts-history/

Missouri Storm Aware. (2014b). What is Storm Aware? Preparing for a Tornado. Retrieved from http://stormaware.mo.gov/preparing-for-a-tornado/

National Weather Service. (2014). National Hurricane Center. National Centers for Environmental Prediction. Retrieved from http://www.nhc.noaa.gov/

National Oceanic and Atmospheric Administration (NOAA). (2014). NOAA Weather Radio All Hazards. Retrieved from http://www.nws.noaa.gov/nwr/

Ready.gov. (2014). Retrieved from http://www.ready.gov/about-us

Rose, J. (2012). Sandy may be costliest hurricane to hit east coast. National Public Radio (NPR). Retrieved from http://www.npr.org/2012/11/26/165945325/sandy-may-be-costliest-hurricane-to-hit-east-coast

Science Channel. (2014) Top 10 Natural Disasters. Discovery Communications, LLC. Retrieved from http://www.sciencechannel.com/life-earth-science/10-natural-disasters.htm

Wallace, L. & Toon, J. (n.d.). Monitoring hurricanes: Georgia Tech engineers assist NASA with instrument for remotely measuring storm intensity. Georgia Institute of Technology. Retrieved March 6, 2014, from http://gtri.gatech.edu/casestudy/gtri-hurricane-imaging-radiometer-HIRAD-NASA

World Meteorological Organization (WMO). (2014). Official observations/official warnings. Severe Weather Information Center. Retrieved from http://severe.worldweather.org/

 


Nuclear Medicine: We’ve Come a Long Way

By Kriss Gross

March 6, 2014

The days of doctors treating physical ailments with only blood samples and microscopes are long gone, having since been replaced or augmented by nuclear technology. Nuclear medicine has paved the way to finding the causes of cardiac disease, cancer, bone problems, and other internal ailments that a typical x-ray would not detect. A variety of scans is now performed, each yielding specific kinds of information; the ailment a doctor suspects determines the type of testing that will be used.

What is Nuclear Medicine?

Like all living organisms, humans are made of biomolecules and are maintained by a kinetic balance called homeostasis. When this balance becomes irregular, due to disease or injury, the body’s molecular system can start to malfunction. Through the technology of nuclear medicine, physicians are able to explore these imbalances and determine the best avenue to begin the healing, when healing is a possibility (Mansi, Ciarmiello, & Cuccurullo, 2012).

Nuclear medicine differs from x-ray, ultrasound, and other diagnostic testing in that it uses small amounts of radioactive materials (tracers) that are injected, swallowed, or inhaled. The type of tracer used depends on the part of the body the nuclear imaging device will be studying. Nuclear medicine can aid in the diagnosis of medical ailments by testing the function of specific organs, tissues, or bone, allowing the physician to visualize abnormalities through changes in the appearance of the structure (Iagaru & Quon, 2014a).

Because of technological advances in hybrid imagery and the release of new radiopharmaceuticals, nuclear medicine is experiencing continued growth in the United States (US). According to Stanford Medical School physicians Iagaru and Quon (2014a), “continued growth of the field will require cost-effectiveness data and evidence that nuclear medicine procedures affect patients’ outcomes. Nuclear medicine physicians and radiologists will need more training in anatomic and molecular imaging. New educational models are being developed to ensure that future physicians will be adequately prepared.”

Applications

Through these advances in nuclear medicine, the imaging devices available to physicians have made great strides in enabling more effective diagnosis of illness and injury. Positron emission tomography (PET), bone scintigraphy (bone scan), magnetic resonance imaging (MRI) and hybrid combinations such as PET/MRI, and white blood cell (WBC) scans are just a few of the technologies available today.

Positron emission tomography (PET)

The most notable application of nuclear imagery is in the cardiology field, with over 1,000 procedures per 100,000 people performed in the US. The majority of these procedures take place in a hospital setting; however, the number of nuclear imaging clinics has risen substantially (Delbeke & Segall, 2011). Positron emission tomography (PET) scans examine the body’s chemistry, whereas other common medical tests, such as MRI scans and computed tomography (CT) scans, reveal only structural aspects of the body. The advantage of PET scans is their ability to reveal details about bodily functions. A single PET procedure allows physicians to “gather images of function throughout the entire body, uncovering abnormalities that might otherwise go undetected” (Iagaru & Quon, 2014a). Because PET scans are a biological imaging examination and disease is a biological process, PET scans are able to detect and stage most cancers sooner than they could be visualized with other common examinations. This early detection also gives physicians access to vital information concerning heart disease and neurological disorders, such as Alzheimer’s. The PET scan’s noninvasive, accurate method lets physicians determine whether a suspected abnormality is malignant or benign, which in turn saves many patients from enduring painful and expensive exploratory surgeries that may not always reveal the stage or extent of a disease. The accuracy of the PET scan aids in earlier detection and diagnosis, putting time on the side of the patient and increasing the chances that treatments will be successful.

While no special preparation is needed prior to a PET scan, some tests require fasting, the elimination of caffeine, and a brief cessation of certain medications. Prior to the procedure, the patient is injected with, or given orally, a small dose of a radioactive substance, a radiopharmaceutical or tracer, which localizes in the specific areas to be tested. This substance emits energy in the form of gamma rays, which are detected by a gamma imaging device aided by a computer that produces images and measurements of the specified organs or tissue (Iagaru & Quon, 2014a).

Bone Scintigraphy (Bone Scan)

Bone scintigraphy (bone scan) is the second most widely used application of nuclear imagery, although these procedures account for only approximately 17% of nuclear imagery performed in the US (Delbeke & Segall, 2011). Skeletal scintigraphy, when performed correctly, has proven to be an effective method “in detecting anatomic and physiologic abnormalities of the musculoskeletal system.” Different skeletal diseases and injuries, such as accidental and non-accidental trauma, arthritis, bone cancer, and congenital or developmental anomalies, present individualized patterns that are observable within the bone scan, thereby increasing the likelihood of early detection, diagnosis, and treatment (Greenspan, 2013).

Patients receiving a bone scan are asked to stay hydrated before and during testing and are given the smallest possible intravenous dose of a radiopharmaceutical (tracer), usually Technetium-99m or a similarly effective compound. General dosing guidelines are followed, with dosages for small children and adolescents based on the patient’s weight. Prior to the scan, which takes place within 2-4 hours of the tracer’s administration, the patient is asked to empty their bladder to remove any visual inaccuracy from the scan’s imagery. If the bladder refills during testing, the scans may be delayed, and catheterization may be necessary to avoid interruptions (Greenspan, 2013).
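
The weight-based pediatric dosing mentioned above can be illustrated with a toy calculation: scale an adult reference dose by body weight, with a floor so very small patients still receive enough tracer for a usable image. This is a hypothetical sketch only, not clinical guidance; the reference dose, the 70 kg reference weight, and the minimum dose below are invented for illustration and do not come from Greenspan (2013):

```python
def scaled_tracer_dose(weight_kg: float,
                       adult_dose_mbq: float = 740.0,      # hypothetical adult reference dose
                       reference_weight_kg: float = 70.0,  # hypothetical reference body weight
                       minimum_dose_mbq: float = 40.0) -> float:
    """Scale an adult reference dose linearly by body weight, with a floor.

    Illustrative only: real bone-scan protocols follow published
    dosage guidelines rather than this simple linear rule.
    """
    scaled = adult_dose_mbq * (weight_kg / reference_weight_kg)
    return max(scaled, minimum_dose_mbq)

print(f"{scaled_tracer_dose(20):.0f} MBq")  # 20 kg child -> about 211 MBq
print(f"{scaled_tracer_dose(2):.0f} MBq")   # 2 kg infant -> floor applies, 40 MBq
```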

Magnetic Resonance Imaging (MRI)

Unlike PET and bone scans, MRI scans are noninvasive procedures that do not require radioactive tracers or ionizing radiation to acquire an internal image. The MRI machine uses a large magnet and a computer to create internal body images, often referred to as “slices.” Each slice displays a limited number of body tissue layers at a time. These layers are then examined on the computer’s monitor, allowing physicians to detect and observe any internal abnormalities. MRI scans can take from 15 to 90 minutes, with an average complete examination taking 1.5 to 3 hours (Iagaru & Quon, 2014b).

Closed MRI machines are large, hollow cylindrical tubes surrounded by a circular magnet. In preparation for an exam, patients receiving an MRI are asked to remove all jewelry, including piercings. Transdermal patches, such as nicotine, birth control, and nitroglycerin patches (which contain trace amounts of metal), also require removal. Patients who suffer from chronic pain or have difficulty lying still may be given a mild sedative to facilitate an uninterrupted exam. Prior to any MRI exam, it is important for the patient to inform the physician of any metal that may be in the patient’s body. This includes artificial or prosthetic limbs or joints, bullets or shrapnel fragments, ear implants, pacemakers, IV ports, and any other accidental or intentional metal that might interfere with the exam or harm the patient (Iagaru & Quon, 2014b).

White Blood Cell (WBC) Scans

To look for internal infection or inflammation, a physician may order a white blood cell (WBC) scan, also known as a leukocyte scan. A WBC scan looks for hidden infection and is particularly useful when a physician suspects infection or inflammation in the abdomen or bones, such as may follow a surgery. WBC scans are nuclear imaging scans that use radiopharmaceuticals (tracers) to locate infection or inflammation in the body. In a procedure referred to as tagging, blood is taken from a patient’s vein, the white blood cells are separated from the sample and mixed with a small amount of radioactive material (the radioisotope indium-111), then returned to the patient’s bloodstream 2-3 hours later via an intravenous injection. The patient undergoes the scan 6-24 hours after that. The scanning machine, which resembles an x-ray device, detects the radiation emitted from the tagged white blood cells, and a computer then displays the image created by the tagged cells (Dugdale, 2012).

WBC scans take 1 to 2 hours to complete and usually take place in a hospital setting; however, outpatient clinics are also available. While no special preparations are necessary, much like an MRI, patients are required to remove all jewelry, piercings, and other metal-containing objects, including hearing aids and dentures or other apparatus containing metal. Patients are asked to wear loose-fitting clothing (without metal snaps or zippers) or don a hospital gown. Patients must tell their physician if, during the previous month, they have undergone a gallium scan, are receiving dialysis, receive nutrition through an IV or steroid therapy, have hyperglycemia, or are taking long-term antibiotics, as patients may be asked to discontinue antibiotics prior to the test. WBC scans are not recommended for women who are pregnant, and birth control is recommended during the course of WBC procedures for those trying to become pregnant (Dugdale, 2012).

Radiopharmaceuticals

Radiopharmaceuticals involve small amounts of radioactive materials (tracers) that are injected, swallowed, or inhaled, with the type of tracer used depending on the part of the body the nuclear imaging device will be studying. Radiopharmaceuticals like Technetium-99m (Tc-99m) account for about 50,000 medical imaging procedures daily in the United States. Tc-99m, the most routinely used medical isotope today, is derived from the parent isotope Mo-99, predominantly produced from the fission of uranium-235 in highly enriched uranium (HEU) targets in aging foreign reactors. North America’s supply of Tc-99m was heavily disrupted after Canada’s Chalk River nuclear reactor experienced an outage several years ago (Ambrosiano, 2013).
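
The supply fragility described above follows directly from the decay arithmetic: Mo-99 has a half-life of roughly 66 hours, so a shipment loses about half of its remaining activity every two and three-quarter days, whether in transit or on the shelf. A short sketch of that exponential decay (the day counts are an illustrative shipping timeline):

```python
MO99_HALF_LIFE_HOURS = 66.0  # approximate half-life of Mo-99

def fraction_remaining(elapsed_hours: float,
                       half_life_hours: float = MO99_HALF_LIFE_HOURS) -> float:
    """Exponential decay: N(t)/N0 = 2 ** (-t / half_life)."""
    return 2.0 ** (-elapsed_hours / half_life_hours)

for days in (1, 3, 7):
    print(f"After {days} day(s): {fraction_remaining(days * 24):.0%} of the Mo-99 remains")
# After 1 day: 78%; after 3 days: 47%; after 7 days: 17%.
# A week of delay anywhere in the supply chain wipes out most of a shipment,
# which is why a single reactor outage is felt so quickly.
```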

In an effort to reduce supply interruptions and eliminate the “potential use in nuclear weapons, acts of nuclear terrorism, or other malevolent purposes” (White House, 2012), the Los Alamos National Laboratory announced that “for the first time, irradiated low-enriched uranium (LEU) fuel has been recycled and reused for molybdenum-99 (Mo-99) production, with virtually no losses in Mo-99 yields or uranium recovery.” This further demonstrates the feasibility of the separation process and the promise of environmentally friendly, cost-effective fuel recycling (Ambrosiano, 2013).

Advantages, Disadvantages, and Safety

Advantages

The obvious advantages of nuclear medicine are realized in the number of patients who are surviving cancer, managing Alzheimer’s and Parkinson’s, and overcoming serious bone injuries. Nuclear imaging has become an irreplaceable tool for detecting the reduction or recurrence of cancers, making its use as important as any of the medications in a patient’s treatment. Because only one scan is needed to obtain a full-body representation, repeated testing is often unnecessary, making these procedures more cost effective as well (Iagaru & Quon, 2014a).

Disadvantages

The medical disadvantages of nuclear imaging show up with individual patients and the inability to apply the technology to everyone. Certain physical factors rule out MRI imaging when the patient has embedded internal metal, i.e., pacemakers, surgically implanted feeding tubes, pins, rods, and other permanent metals. MRIs are also not recommended during the first three months of pregnancy. Pregnancy is likewise a factor in the potential use of PET and WBC scans, as the possible dangers during pregnancy are yet to be determined (Iagaru & Quon, 2014b).

Economically, nuclear imaging is expensive, and many insurance companies limit its use unless verifiable need is demonstrated, leaving some patients with decreased levels of treatment or no treatment at all. Another factor is limited access to reliable sources of the isotopes needed to perform the imaging. The US is addressing this issue with accelerated commercial projects to produce the molybdenum-99 isotope domestically, reducing the use of highly enriched uranium (HEU) and increasing the use of low-enriched uranium (LEU), like the advancements being made at the Los Alamos National Laboratory (White House, 2012; Ambrosiano, 2013).

Safety

Nuclear imaging procedures are considered among the safest and most prevalent imaging exams in use today. Patients receive radiopharmaceuticals in minimal doses that deliver the smallest amount possible to achieve the diagnostic information needed, often exposing the patient to less radiation than an x-ray (Greenspan, 2013). The scanning device does not produce any radiation, and the radiation emitted from the radioisotopes is minimal; because the materials break down quite rapidly, small traces of radioactivity generally diminish within 1 or 2 days. There are no verifiable cases of injury due to exposure to radioisotopes (Dugdale, 2012). The education and training received by radiologists, technologists, and physicians requires responsible behavior that ensures the safety of staff and patient alike. To produce the quality image required for diagnostic success, an “as low as reasonably achievable” (ALARA) approach is maintained to ensure minimal dosages and exposure (Greenspan, 2013).

In Closing

Nuclear medicine and medical imaging have come a long way, and regardless of the continuing hurdles, the advancements already gained allow physicians of today and of the future to pursue new avenues in the prevention, diagnosis, and healing of the many ailments that patients and physicians face together. PET, MRI, and WBC scans, along with improving radiopharmaceuticals, are saving lives every day. Future discoveries and continued research will aid in finding the causes of cardiac disease, cancer, bone problems, and other internal ailments that in the past could lead to continued illness and premature death. While there is still a long way to go, a future free from disease, illness, and permanent injury is no longer so far away.

 

References

Ambrosiano, N. (2013). Domestic production of medical isotope Mo-99 moves a step closer. Los Alamos National Laboratory. Retrieved from http://www.lanl.gov/newsroom/news-releases/2013/May/05.13-domestic-production-of-medical-isotope-mo99.php

Delbeke, D. & Segall, G. (2011). Status of and trends in nuclear medicine in the United States. The Journal of Nuclear Medicine, 52. Issues and Controversies in Nuclear Medicine, pp. 24S-8S. Retrieved from http://search.proquest.com.libproxy.edmc.edu/docview/913590094

Delbeke, D., Royal, H., Frey, K., Graham, M., & Segall, G. (2012). SNMMI/ABNM joint position statement on optimizing training in nuclear medicine in the era of hybrid imaging. The Journal of Nuclear Medicine 53(9), pp. 5. Retrieved from http://search.proquest.com.libproxy.edmc.edu/docview/1041061503

Dugdale, D. (2012). WBC scan. U.S. National Library of Medicine. Retrieved from http://www.nlm.nih.gov/medlineplus/ency/article/003834.htm

Greenspan, B. (2013). Skeletal scintigraphy. ACR–SPR Practice Guideline for the Performance of Skeletal Scintigraphy (bone scan). Retrieved from http://www.acr.org/~/media/839771405B9A43F7AF2D2A9982D81083.pdf

Iagaru, A. & Quon, A. (2014a). Illuminating and treating diseases. Stanford School of Medicine. Retrieved from http://nuclearmedicine.stanford.edu/

Iagaru, A. & Quon, A. (2014b). Magnetic Resonance Imaging-MRI, Patient Prep Instructions. Stanford Medicine Imaging. Retrieved from http://stanfordhospital.org/clinicsmedServices/medicalServices/imaging/docs/MRI_Booklet.pdf

Mansi, L., Ciarmiello, A., & Cuccurullo, V. (2012). PET/MRI and the revolution of the third eye. European Journal of Nuclear Medicine and Molecular Imaging, 39(10), pp. 1519-24. Retrieved from http://search.proquest.com.libproxy.edmc.edu/docview/1073650386

White House. (2012). Fact sheet: Encouraging reliable supplies of molybdenum-99 produced without highly enriched uranium. Office of the Press Secretary. Retrieved from http://www.whitehouse.gov/the-press-office/2012/06/07/fact-sheet-encouraging-reliable-supplies-molybdenum-99-produced-without-

Hybrids: A Cleaner Way to Drive

Kriss Gross

February 28, 2014

Imagine for a moment that, overnight, several oil tankers headed to the U.S. were sunk by terrorists. In response to this news, the gas stations have long lines and the price at the pump has gone up a dollar from what it was yesterday. To make matters more unsettling, the weather service is notifying viewers that the tropical storm that was several hundred miles out to sea has shifted direction, is aimed at the east coast, and is strengthening into a hurricane. With the possibility of power outages, people are filling not only their vehicles but gas cans for generators as well. The situation becomes increasingly tense as local governments ask that communities stick together and help their neighbors, as the impending fuel shortages will impede the National Guard’s ability to provide aid and security during and after the storm. While this scenario is fiction, it is not unrealistic, and it emphasizes the need for the U.S. to decrease its dependence on foreign oil and increase its forward motion toward more fuel-efficient modes of transportation (Stein, 2013).

According to a 2012 report by the Department of Energy, the United States spends almost $1 billion a day to purchase oil from other countries, which Americans use to power their cars, trucks, planes, trains, and ships. An additional $55 billion is spent annually on the effects of emissions from these transportation modes, i.e., health and environmental damages. In response, “advances in electric vehicles, engine efficiency, and clean domestic fuels open up cost-effective opportunities to reduce our oil dependence, avoid pollution, and create jobs designing and manufacturing better cars, trucks, and petroleum alternatives” (U.S. Department of Energy (DOE), 2012a). To increase U.S. consumers’ willingness to consider purchasing a hybrid, manufacturers are addressing the issues of price, fuel economy, and overall sustainability. Of course, as with anything that may affect consumer spending, the political aspect, both national and international, presents another consideration.

Building a Better Hybrid

Hybrid vehicles are more than just cars that run on battery power; hybrids embrace all the available technology that will ultimately reduce American consumers’ dependence on foreign oil to power the transportation industry. So what makes a car a hybrid? There are three degrees of hybridization (mild, full, and plug-in) and, depending on the hybrid, a different drivetrain (Union of Concerned Scientists, 2013). Of five defining characteristics (idle-off capability, regenerative braking, power assist with engine downsizing, electric-only drive, and extended battery-electric range), a vehicle meeting the first three is considered a “mild” hybrid; adding electric-only drive mode makes it a “full” hybrid; and the final characteristic, extended battery-electric range, makes it a “plug-in” hybrid. To meet the goal of reducing foreign oil dependence and environmental impact, manufacturers are tackling one of the most significant issues of hybrid vehicles, price, with designers working on battery cost, electric drivetrains, structural weight, engine efficiency, and fuel (DOE, 2012a).
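
Because that classification is purely a checklist, it can be expressed directly in code. A small sketch following the degrees of hybridization summarized above (the characteristic names are my paraphrases of the Union of Concerned Scientists’ terms):

```python
# The first three characteristics define a "mild" hybrid.
MILD_SET = {"idle-off", "regenerative braking", "power assist + engine downsizing"}

def classify_hybrid(characteristics: set) -> str:
    """Classify a vehicle by which hybridization characteristics it meets."""
    if not MILD_SET <= characteristics:
        return "not a hybrid"
    if "extended battery-electric range" in characteristics:
        return "plug-in hybrid"
    if "electric-only drive" in characteristics:
        return "full hybrid"
    return "mild hybrid"

print(classify_hybrid(MILD_SET))                                        # mild hybrid
print(classify_hybrid(MILD_SET | {"electric-only drive"}))              # full hybrid
print(classify_hybrid(MILD_SET | {"electric-only drive",
                                  "extended battery-electric range"}))  # plug-in hybrid
```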

Batteries and Drivetrains

Better Batteries. In the past, America fell behind in the development of a better vehicle; however, since 2009, the DOE’s Office of Energy Efficiency and Renewable Energy (EERE) and U.S. automakers have been making strides to change this. One example is the manufacture of advanced vehicle batteries, an industry that has jumped from two factories in 2009 to 30 in 2012, allowing the U.S. to produce enough batteries and components to support the production of one million plug-in hybrid and electric vehicles by 2015. Not only does this industry advancement put America in a good manufacturing stance, it also puts tens of thousands of American workers into jobs (DOE, 2012a). While hybrid vehicles have been on the market for several years, their popularity was diminished by the cost of the vehicle. EERE has worked to bring that cost down, cutting the cost of the battery system by more than 35% since 2008 and working toward a further 70% reduction by 2015 (DOE, 2012a).

Electric Drivetrains. The reduction in battery cost is only one dimension being addressed, as the drivetrain is also going electric. The drivetrain is the mechanism that powers the drive wheels, and hybrids offer three options: series, parallel, and series/parallel, with the choice depending on the overall use of the vehicle. The series drivetrain has an independent electric motor to start the vehicle’s motion and a computer that determines whether the power to run the motor comes from the battery or the gasoline engine. Because the engines in series-drivetrain vehicles are generally smaller than conventional engines and the battery cell is larger, these vehicles are better suited to the stop-and-go traffic of large urban areas and are being considered for buses and other urban vehicles, such as taxis and limousine services. Using a computer and transmission, the parallel drivetrain is the choice of most hybrid vehicles manufactured for consumer use. This drivetrain draws energy from both the gasoline engine and the battery cell (smaller than that used with a series drivetrain) and also uses regenerative braking to recharge the battery. With the engine directly connected to the drive wheels, the inefficiency of converting mechanical power to electricity and back is eliminated, making these hybrids better suited to highway driving. Finally, while more expensive, series/parallel drivetrains combine the best of both systems with a larger battery cell and a generator as well (Union of Concerned Scientists, 2013).
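
To make the series-drivetrain behavior concrete: the wheels are always driven by the electric motor, and the onboard computer only chooses where the motor’s electricity comes from. Here is a toy version of that decision logic; the state-of-charge and power thresholds are invented for illustration, not taken from the Union of Concerned Scientists:

```python
def series_hybrid_source(power_demand_kw: float, battery_soc: float,
                         min_soc: float = 0.25,            # assumed reserve charge level
                         battery_limit_kw: float = 40.0    # assumed battery power limit
                         ) -> str:
    """Choose the energy source for a series hybrid's drive motor.

    The wheels are always motor-driven; only the source of electricity
    changes between the battery and the engine-generator.
    """
    if battery_soc > min_soc and power_demand_kw <= battery_limit_kw:
        return "battery"            # typical of stop-and-go city driving
    return "engine-generator"       # sustained or high power demand

print(series_hybrid_source(15, 0.80))  # battery
print(series_hybrid_source(60, 0.80))  # engine-generator (demand too high)
print(series_hybrid_source(15, 0.10))  # engine-generator (battery depleted)
```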

The combination of these technologies results in vehicles that consume less fuel and reduce environmental impact through lower CO2 emissions. Since the passage of the Clean Air Act and its revision in 1990, the Environmental Protection Agency (EPA) has spearheaded programs such as the “National Clean Fleets Partnership,” in which major companies like UPS, FedEx, Pepsi, Schwan’s, and others are upgrading their fleets to electric, hybrid, and alternative-fuel vehicles and redesigning their routes to further reduce drive time and fuel consumption (DOE, 2012b).

Recyclability

As the concept, production, and utilization of hybrid vehicles become more mainstream, the flaws in producing these vehicles also become more apparent, especially in regard to recycling the battery. Every battery eventually reaches a point where it can no longer be recharged and must be replaced, and with the batteries used in hybrid vehicles, the issue has grown with the size of the battery. Because of the caustic nature of the materials used in building these batteries, they cannot simply be thrown away. Current regulatory policy puts the responsibility of recycling these batteries on the manufacturer. With current hybrid models such as the Toyota Prius and the Honda Civic, the battery manufacturer, Panasonic, reclaims the batteries and reuses the materials in the production of new batteries. However, other companies, which do not reuse the batteries’ components, burn them. This practice creates another complicated and environmentally disastrous situation, one that requires strict regulations and standards to be applied and enforced (Lewis, Park, & Paolini, 2012, p. 5).

Sustainability

In keeping with the need to reduce oil dependency, the power needed to operate the factories that manufacture hybrid batteries, drivetrains, and vehicles needs to come from sustainable sources. After all, it is counterproductive to manufacture a product whose end goal is reducing negative environmental impact in a facility that creates more pollution than its product will reduce. The solution is building factories that use sustainable energy systems, i.e., solar, wind, or biomass, and converting current operations to systems that apply combined heat and power (CHP) (DOE, 2012c). The same principle could be applied to the recharging stations that will need to be built for electric vehicles (EVs) to plug into (Stein, 2013, p. 11).

Lighter Weight Materials

Magnesium alloys, high-strength steel, titanium, and carbon-fiber composites are the next step in developing a lighter-weight vehicle. Research indicates that for every 10% of vehicle weight removed, there is a 7% gain in fuel efficiency. DOE has a goal of reducing overall car weight by 50% by 2015, thereby cutting fuel costs by $4,300 over the life of the vehicle (EERE, 2012). Because of reduced fuel demand, the U.S. would ultimately also reduce its dependence on foreign oil imports by 25%. Another advantage of these lighter-weight materials would be a dramatic reduction in the need to mine iron ore and other steel-related materials, further reducing production costs. These materials would be used not only on the body of the vehicles but on the engine and other internal components as well (Schutte, 2012).
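
Putting numbers to that rule of thumb: extrapolated linearly, DOE’s 50% weight-reduction goal implies roughly a 35% fuel-efficiency gain, and under hypothetical driving assumptions (a 25 MPG baseline, 150,000 lifetime miles, and $3.00 per gallon, my figures rather than DOE’s inputs) the lifetime savings land in the same ballpark as the $4,300 figure cited above:

```python
# Rule of thumb from the paragraph: each 10% weight cut yields ~7% efficiency gain.
weight_reduction = 0.50                    # DOE's 2015 goal
efficiency_gain = 0.7 * weight_reduction   # linear extrapolation -> 0.35

# Hypothetical lifetime-fuel assumptions (illustrative, not DOE inputs):
base_mpg = 25.0
lifetime_miles = 150_000
price_per_gallon = 3.00

new_mpg = base_mpg * (1 + efficiency_gain)
gallons_saved = lifetime_miles / base_mpg - lifetime_miles / new_mpg
print(f"Efficiency gain: {efficiency_gain:.0%}")
print(f"Lifetime fuel savings: ${gallons_saved * price_per_gallon:,.0f}")
# -> Efficiency gain: 35%; lifetime fuel savings: $4,667
```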

Politics

On national and international levels, the politics of “going green” is not uncharted territory. The implementation of the Clean Air Act and the ongoing concern over foreign oil dependence have made the future of hybrid vehicles inevitable; however, achieving these goals requires policy change, a more focused pursuit of battery technology, and concentrated efforts to reduce the overall cost of hybrid ownership.

Domestic Policy. Due in part to the high cost of hybrid ownership, advancements in battery production are needed to reduce costs, thereby making hybrids affordable for a broader section of consumers. While the number of U.S. companies manufacturing hybrid batteries has risen, tax incentives and monetary awards to companies that further the advancement of battery technology would go a long way toward realizing a larger volume of consumer hybrid purchases. Although tax credits were given from 2006 through 2009, they were limited because many hybrid vehicles were being imported rather than manufactured domestically. A further complication, and a cause of less-than-stellar forward progress, is that the oil industry, with decades to establish a firm foothold, has no desire to lose its current dominance or relevancy and continues to lobby against interests trying to make substantial gains in battery development and implementation (Lewis, Park, & Paolini, 2012, p. 6).

Foreign Policy. The advantages of reducing U.S. dependency on foreign oil are numerous, most notably regarding the tenuous relationship between the U.S. and China concerning their shared dependence on oil produced in the Persian Gulf, specifically Saudi Arabia. By reducing oil dependency, the U.S. could reasonably decrease its military presence, thereby cutting the cost of maintaining that presence and removing the need to continue providing weapons, as has been done to keep the U.S. in favor with Saudi oil suppliers. Of course, it is unreasonable to think that reducing foreign oil dependence means U.S. military presence can be reduced to zero; the circumstances are not so simplistic. But just as the military presence in Germany, South Korea, and Japan remains, the volume of military personnel could at least be drastically reduced. By eliminating the need to protect sea trade routes like the Strait of Malacca, $70-100 billion could become available to further the advancement of hybrid battery production and other areas of hybrid sustainability, i.e., the charging station infrastructure (Lewis, Park, & Paolini, 2012, p. 6).

Economics

“If the United States stopped using gasoline to power its automobiles, it would essentially become energy independent overnight” (Stein, 2013, p. 6). Although the statement may have some truth to it, it is hardly plausible; complete energy independence will more likely take several years, with the biggest issue being the affordability of hybrids and EVs for the general public. While the U.S. is making steady progress in its economic recovery, the high number of Americans still unemployed and struggling to make ends meet also means a considerable number of consumers are not even thinking about hybrid or electric vehicles, let alone considering a purchase. “For example, in 2009, there were 8.8 million families living below the poverty line. For an idea of what that measures, for a family of four made up of two adults and two children, the poverty line was $21,756.93” (Stein, 2013, p. 16). That level of income is less than the outright purchase price of most hybrid vehicles on today’s market.

Personal Choice

While researching the hybrid vehicles available today, I found that making a purchasing decision is far from easy. I chose five hybrid models to research, and while I am in no position to purchase a vehicle, I did my research under the assumption of a better financial picture. It is also important to understand that what matters to one consumer considering a vehicle purchase may be of no concern to another. When considering a hybrid purchase, I looked at what is important to me, and quite frankly I am not impressed with my options. First and foremost, the vehicle must be made in the USA. I really liked the Subaru hybrid model; however, after discovering it was neither designed nor manufactured in the U.S., I removed it from my list. Imagine my dismay when I discovered that the Ford Fusion is assembled in Mexico and, even more disturbing, that of the models I chose to research, not one was both designed and manufactured in America. So, setting my disappointment aside, I continued my comparisons based on other personal criteria. Because I like to travel when finances permit, I spend several hours in my car, so comfort and ergonomics are essential. I am long-legged, so legroom is important, as well as cargo space for luggage and camera gear. After basing my decision on best overall fuel economy and the amenities I wanted, I chose the 2014 Toyota Avalon XLE (Kelley Blue Book, 2014). While it is fun to dream, it will be some time before a vehicle like that finds its way into my driveway; although, maybe by then, Subarus will be built in the U.S.

 

 

References

Kelley Blue Book. (2014). Cars for sale. Retrieved from http://www.kbb.com/cars-for-sale/?tab=mkmd

Lewis, H., Park, H., & Paolini, M. (2012). Frontier battery development for hybrid vehicles. Chemistry Central Journal, 6(1). Retrieved from http://dx.doi.org.libproxy.edmc.edu/10.1186/1752-153X-6-S1-S2

Schutte, C. (2012). Lightweighting Materials. Vehicle Technologies Program. Retrieved from http://www1.eere.energy.gov/vehiclesandfuels/pdfs/merit_review_2012/plenary/vtpn04_lm_schutte_2012_o.pdf

Stein, F. (2013). Ending America’s energy insecurity: Why electric vehicles should drive the United States to energy independence. Homeland Security Affairs, 9(1). Retrieved from https://login.libproxy.edmc.edu/login?url=http://search.proquest.com.libproxy.edmc.edu/docview/1368766010

Union of Concerned Scientists. (2013). How hybrid cars and trucks work. Center for Science and Democracy. Retrieved from http://www.ucsusa.org/clean_vehicles/smart-transportation-solutions/advanced-vehicle-technologies/hybrid-cars/how-hybrids-work.html

U.S. Department of Energy. (2012a). Sustainable Transportation. Office of Energy Efficiency and Renewable Energy. Retrieved from http://www1.eere.energy.gov/office_eere/pdfs/55295.pdf

U.S. Department of Energy. (2012b). America’s clean, efficient fleets: An infographic. Retrieved from http://energy.gov/articles/americas-clean-efficient-fleets-infographic

U.S. Department of Energy. (2012c). Top 10 things you didn’t know about combined heat and power. Retrieved from http://energy.gov/articles/top-10-things-you-didn-t-know-about-combined-heat-and-power

Biotechnology: Changing the Face of Agriculture

Kriss Gross

Argosy University

February 13, 2014

 

Global concerns over feeding the masses, including those in underdeveloped countries, have brought the use of GMO food sources to the attention of anyone with access to the internet and other media. Proponents promote GMO foods because they allow for larger, higher-quality crop yields, enabling larger numbers of people, especially those in countries fighting malnutrition and starvation, to survive and thrive. Opponents, however, ask whether GMOs are really the answer. Are the risks involved in consuming GMO foods worth it?

So what exactly are GMOs? “GMOs, or ‘genetically modified organisms,’ are plants or animals that have been genetically engineered with DNA from bacteria, viruses or other plants and animals. These experimental combinations of genes from different species cannot occur in nature or in traditional crossbreeding” (Non-GMO Project, 2014). The purpose of genetic modification (GM) is to make a plant more resistant to drought, extreme cold, insects, disease, and salinity. GM is also used to enhance the nutrition and flavor of a given plant (Oliver, 2012). Understanding what GMOs are, why are they necessary? In a word: population growth. The world population is now over 7.2 billion people and climbing, with the largest increases seen in Africa (Worldometers, 2014). With this growth, the demand for sustainable food is rising as well; however, while the Earth’s population is increasing, its size is not. There is also an acknowledgment that increasing planting acreage is not a plausible solution. To feed a rising population without increasing planting space, the only logical answer is to increase yields on existing acreage, and GM crops address exactly this issue.
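
The yield argument can be made concrete with numbers that appear in this article: roughly 7.2 billion people today and, per the Lynas quote below, 9.5 billion by 2050. Holding planted acreage fixed, population growth alone implies about a one-third increase in yield per acre. A back-of-the-envelope sketch that deliberately ignores diet changes, waste, and trade:

```python
population_now = 7.2e9    # current world population (Worldometers, 2014)
population_2050 = 9.5e9   # 2050 projection cited by Lynas (2013)

# With planting acreage fixed, required yield scales directly with population.
required_increase = population_2050 / population_now - 1
print(f"Required yield increase on existing acreage: {required_increase:.0%}")
# -> Required yield increase on existing acreage: 32%
```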

Genetically engineered crops have been a reality since 1996, when herbicide-tolerant (HT) soybeans, developed to resist the herbicides used to control weeds, were planted in 17 percent of U.S. soybean fields. Cotton and corn followed, and adoption of HT varieties reached 90 percent of planted acres in 2013. Insect-resistant corn and cotton, which contain a gene from the soil-borne bacterium Bt (Bacillus thuringiensis), have also been planted since 1996, and varieties of corn and cotton containing both the HT and Bt genes (“stacked”) have since been planted as well (Fernandez-Cornejo, 2013).

The genetic modification of agricultural crops can be traced back to the “father of the Green Revolution,” the “man who fed the world,” Norman Borlaug. During a humanitarian mission to Mexico, Borlaug (1914-2009) developed a dwarf wheat hybrid that could withstand bending and breaking, unlike longer-stalked varieties, producing larger amounts of grain while being more disease resistant. This new wheat variety was successful not only in Mexico but in India and Pakistan as well, and was later planted in nations “in Central and South America, the near and Middle East, and Africa” (Borlaug, n.d.). Borlaug’s successors, Van Montagu, Chilton, and Fraley, “joint recipients of the 2013 World Food Prize for their research and achievements in agricultural biotechnology,” gained “pioneer” status for their molecular biology discoveries in the genetic engineering of plants; they developed insect- and disease-resistant crops that are also tolerant of extreme climate variations, require fewer chemical fertilizers, and improve the agricultural lives of the poorest farmers worldwide (ISU, 2014).

Genetically modified crops have been the topic of much controversy over the past several years, with fear-mongers relaying tales of rising rates of disease among livestock fed GM grains, and of those diseases contaminating the foods people eat. People eagerly jump on the non-GMO bandwagon out of fear. The problem with this thinking is that it unnecessarily halts continuing research into better crop yields. People are naturally fearful of concepts they do not understand, and people like Jeffrey Smith, fear-mongering author of “Seeds of Deception” (Oliver, 2012), do well to promote those fears. Another oft-heard voice, the former GMO opponent Mark Lynas, offered this statement in his presentation to the Oxford Farming Conference in January 2013:

“I want to start with some apologies. For the record, here and upfront, I apologize for having spent several years ripping up GM crops. I am also sorry that I helped to start the anti-GM movement back in the mid-1990s, and that I thereby assisted in demonizing an important technological option which can be used to benefit the environment. … And this is the challenge that faces us today: we are going to have to feed 9.5 billion hopefully much less poor people by 2050 on about the same land area as we use today, using limited fertilizer, water and pesticides and in the context of a rapidly-changing climate” (Lynas, 2013).

Getting these retracted statements heard is imperative if research into improving crop output is to continue. As Lynas points out, 9.5 billion people is a large number of mouths to feed, and voicing unfounded opinions about how the scientific community is attempting to address that need does little to solve the problem (Oliver, 2012). The importance of changing public perception is echoed in the 2013 brief, “From Monologue to Stakeholder Engagement: The Evolution of Biotech Communication,” where the authors discuss “a need for openness and transparency with the publics on various issues and concerns about the technology including its social, economic, cultural, and institutional dimensions” (Navarro, Tome, & Gimutao, 2013).

To improve perceptions of GMO crops worldwide, workshops are taking place in communities where successful farms are employing GM planting methods. One such farm in South Africa, Makhatini Flats, is operated by “small-scale” farmers who are sharing their first-time success with Bt cotton. Makhatini Flats’ semi-arid climate presented visiting Members of Parliament (MPs) with the advantages of the GMO crop despite the area’s limited resources (Navarro, Tome, & Gimutao, 2013, p. 18). In the Philippines, the primary vegetable crop is the eggplant. Unfortunately, eggplant is highly susceptible to the fruit and shoot borer (FSB), which can destroy an entire crop. Bt eggplant, if planted, promises a 40 percent increase in sellable harvest, in turn increasing farmer income by 50 percent. Visitors get to see the previously hole-ridden eggplant variety compared with the new blemish-free vegetable. A further purpose of the workshops is to undo the misperceptions spread by opponents of GMO crops. Because those who make decisions about future GMO planting have been misinformed by the media and non-GMO groups, it is imperative that they be given accurate information concerning the economic and environmental advantages the new GMO crops can provide for the area (Navarro, Tome, & Gimutao, 2013, p. 24-5).

Adopting a more open-minded stance toward GM crops could allow growers a simpler, more flexible planting and growing season. GM crops allow for narrower crop rows, lower till rates, and fewer herbicide applications, since glyphosate, an herbicide, can be applied later in the season across a broader time frame, reducing the concern of crop injury. Glyphosate also carries no carryover restrictions and leaves no residual activity, allowing worry-free crop rotation. All of these benefits add up to overall financial advantages (Fernandez-Cornejo, 2013). The lingering misconceptions about GMO crops are due mostly to failures of communication among developers, companies, and farming communities, and to the insistence of GMO opponents on releasing one-sided or false information. With higher yields, reduced pesticide and herbicide applications, reduced negative environmental impact, and reduced land clearance for planting acreage, the continuing research and positive outcomes of current GMO planting can hopefully dispel myths and at least prompt a second look at the overall gains to be made through a committed use of GE crops.

References

Antoniou, M., Robinson, C., & Fagan, J. (2012). GMO myths and truths: An evidence-based examination of the claims made for the safety and efficacy of genetically modified crops. Earth Open Source. Retrieved from http://www.nongmoproject.org/wp-content/uploads/2010/08/GMO_Myths_and_Truths_1.31.pdf

Armenakas, S. & Alexiades-Armenakas, M. (2013). Genetically-modified organisms in United States agriculture: mandate for food labeling. Food and Nutrition Sciences 4(8), pp. 807-811. Retrieved from https://login.libproxy.edmc.edu/login?url=http://search.proquest.com.libproxy.edmc.edu/docview/1420276813

Borlaug, N. (n.d.). Dr. Norman E. Borlaug. Norman Borlaug Institute for International Agriculture, Texas A&M University. Retrieved from http://borlaug.tamu.edu/about/dr-norman-e-borlaug/

Fernandez-Cornejo, J. (2013). Adoption of genetically engineered crops in the U.S.: recent trends in GE adoption. United States Department of Agriculture. Retrieved from http://www.ers.usda.gov/data-products/adoption-of-genetically-engineered-crops-in-the-us/recent-trends-in-ge-adoption.aspx#.UvjwtPldWSo

Iowa State University (ISU). (2013). Scientific discovery and the fight to end global hunger: Marc Van Montagu, Mary-Dell Chilton and Robert T. Fraley. Biographies. Retrieved from http://www.lectures.iastate.edu/lecture/30381

Lynas, M. (2013). Lecture to Oxford Farming Conference, 3 January 2013. Retrieved from http://www.marklynas.org/2013/01/lecture-to-oxford-farming-conference-3-january-2013/#sthash.bAC8Xlgy.dpuf

Navarro, M., Tome, K., & Gimutao, K. (2013). From monologue to stakeholder engagement: The evolution of biotech communication. The International Service for the Acquisition of Agri-biotech Applications (ISAAA) 45.  Ithaca, NY. Retrieved from http://www.isaaa.org/resources/publications/briefs/45/download/isaaa-brief-45-2013.pdf

NonGMO Project (2014). GMO facts. Retrieved from http://www.nongmoproject.org/learn-more/

Oliver, R. (2012). Sick bees – Part 18E: Colony collapse revisited – genetically modified plants. Scientific Beekeeping. Retrieved from http://scientificbeekeeping.com/sick-bees-part-18e-colony-collapse-revisited-genetically-modified-plants/

Toft, K. (2012). GMOs and global justice: applying global justice theory to the case of genetically modified crops and food. Journal of Agricultural and Environmental Ethics 25(2), pp. 223-237. Retrieved from http://dx.doi.org.libproxy.edmc.edu/10.1007/s10806-010-9295-x

Worldometers. (2014). Current World Population. Worldometers: Real Time World Statistics. Retrieved from http://www.worldometers.info/world-population/

Nanotechnology: The View into a Very Small World

“There’s Plenty of Room at the Bottom” was the name of a lecture given by physicist Richard Feynman in December of 1959, in which he stated, “It is a staggeringly small world that is below. In the year 2000, when they look back at this age, they will wonder why it was not until the year 1960 that anybody began seriously to move in this direction.” Further into the lecture, Feynman puzzled over problems in chemistry, such as being able to “make an analysis of any complicated chemical substance”: all one needed to do, he said, was look at the substance and observe where the atoms were. He then pointed out the obstacle, that the electron microscope of the time was 100 times too weak, and later challenged his audience: “Is there no way to make the electron microscope more powerful?” More than a decade later, during his work in ultra-precision machining, Professor Norio Taniguchi gave this science of all things small its name: nanotechnology. It would be another decade still before individual atoms were actually “seen”, with the newly developed scanning tunneling microscope in 1981, and nanotechnology truly began.

Feynman also discussed computers and the fact that, at the time, computers filled rooms; he was proposing the possibility of smaller computers, much smaller computers. While we have not gone to nano-size quite yet, the progress made in that direction is remarkable. After all, are the smart phones of today not hand-held computers, so to speak? At the end of his lecture, Feynman issued a challenge: “I hereby offer a $1000 to the first guy who can take the information on the page of a book and put it on an area 1/25,000 smaller in linear size and in such a manner that it can be read by an electron microscope.” He then offered another $1,000 prize to “the guy who makes an operating electric motor, a rotating electric motor which can be controlled from the outside and not counting the lead-in wires, is only 1/64 inch cube. I do not expect that such prizes will have to wait very long for claimants” (Feynman, 1992).
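
A quick back-of-the-envelope calculation (added here for illustration; it is not part of the lecture) shows why the writing challenge was so daunting: a 1/25,000 reduction in linear size shrinks a page’s area by the square of that factor,

\[
\left(\frac{1}{25{,}000}\right)^{2} \;=\; \frac{1}{6.25\times10^{8}},
\]

so each printed page must fit on roughly one 625-millionth of its original area.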

Over 50 years later, the world of nanotechnology is no longer so out of reach: with personal computers and smart phones, the miniaturized world Feynman was so fond of is now a reality. That reality has gone well beyond the electron microscope he complained was too weak, an instrument that seems antique in comparison to the scanning tunneling microscope (STM), the atomic force microscope (AFM), and the transmission electron microscope (TEM) (Smallwood, 2009). While ethical questions about its applications remain open, students at UC Berkeley have built a nano motor, another of Feynman’s musings. As for the $1,000 challenges Feynman extended, the 1/64-inch motor was built in 1960, with Feynman handing the winner, William McLellan, his prize; the motor weighed 250 micrograms and produced one millionth of a horsepower. It was not until 1985 that Feynman’s 1/25,000-scale writing challenge was met, when Tom Newman reproduced the first page of “A Tale of Two Cities” (Feynman, 1992). Considering the amount of information that a 32 GB thumb drive holds, even Newman must be shaking his head at the advances since his achievement.

So what does all this small stuff mean in terms of today? Nanotechnology has many applications, from high-powered microscopes, miniature motors, and information storage, to everyday consumer products, medical applications, environmental improvements, and viable energy options. What may surprise many people is that nanotechnology has already been on their hands and faces, in their mouths, and, for many, is being ingested. At the 4th Central and Eastern Europe regional meeting on the Strategic Approach to International Chemicals Management (SAICM) and the United Nations Institute for Training and Research (UNITAR) (Lodz, Poland, 27-29 June 2011), the uses of nanotechnology were discussed. Many of those uses involve health and beauty aids, like face cream, sunscreens, and dental resins. Nanotechnology is used to make hearing aids, contact lenses, body wash and shampoos, bandages, energy drinks, drug-delivery patches, and man-made skin. An especially advanced bandage called “Nanosilver Wound Dressing” is registered with the EPA as a Tox Category IV disinfectant and is being used on Navy submarines, cruise liners, airplanes, and in medical facilities to treat burn victims (Soldatenko, 2011). Disinfectant delivery methods such as this were studied in a report published in Environmental Health Perspectives, which concluded that uses of nanotechnology of this kind show no recognizable ill effects, although research is still ongoing (Cooney, 2010). Such applications were also discussed in a Mayo Clinic report on skin regeneration, where researchers observe that “advances in control-release systems, nanotopography, biomechanics, materials science, and stem cell biology will enable researchers to design increasingly sophisticated engineered skin grafts with the potential to treat acute or chronic wounds” (Wong, Gurtner, & Longaker, 2013).

Soldatenko (2011) points out environmental applications of nanotechnology in waste management programs, through the development of viable nanofiltration systems used in water and air purification that reduce pollution and remove excess salts, heavy metals, and bacteria. Gold nanoparticles in air filters are used to remove toxic organisms and bacteria from the air. In another instance, the oil industry has developed MCM-41 (known also as “self-assembled monolayers on mesoporous supports,” or SAMMS), with pore sizes in the range of 10-100 nanometers, for the removal of ultrafine contaminants. Soldatenko further notes that a nanoparticle-reinforced polymeric material can replace structural metallic components in automobiles, leading to a reduction of 1.5 billion liters of gasoline consumption over the life of one year’s production of vehicles and thereby reducing carbon dioxide emissions annually by more than 5 billion kilograms (Soldatenko, 2011).
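
A rough sanity check on those figures (a sketch of the arithmetic, not taken from Soldatenko) uses the standard estimate that burning one liter of gasoline releases about 2.3 kilograms of CO2 at the tailpipe; the balance of the cited 5-billion-kilogram total presumably reflects upstream refining and production emissions:

    # Rough sanity check of the fuel-savings claim (the emission factor is an
    # approximation for tailpipe combustion, not a figure from the source).
    liters_saved = 1.5e9        # gasoline saved over one model year's vehicles
    kg_co2_per_liter = 2.3      # approximate CO2 released per liter burned

    tailpipe_co2 = liters_saved * kg_co2_per_liter
    print(f"CO2 avoided at the tailpipe: {tailpipe_co2:.2e} kg")  # ~3.45e9 kg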

An area of grave importance is energy consumption and the world’s dependence on petroleum for fuel and power. To address these growing concerns, projects such as “Caltech and Berkeley’s Joint Center on Artificial Photosynthesis, the Solar H2 network based at Uppsala University, the Solar Fuels Initiative (SOFI) based at Northwestern University, and Dan Nocera’s work at MIT and Harvard” (Faunce, 2013) continue the pursuit of solar energy through nanotechnology. Consumers rely on the “short-term benefits” of fossil fuels such as natural gas, coal, and oil, and the companies that extract and process these fuels rely on subsidies to keep prices competitive, yet they continue to impede the forward movement of solar technology. Despite the efforts to delay the technology, advancements continue to prove that artificial photosynthesis can provide an inexpensive source of hydrogen fuel, oxygen, carbon dioxide absorption, and soil nutrition (Faunce, 2013).

Synthetic biology provides for substantial advancement within three segments of nanotechnology and photosynthesis: light capture, water splitting (catalysis), and carbon dioxide reduction. Light capture involves developing nanostructured materials (synthetic organisms) that absorb photons across a broader solar spectrum, increasing absorption per unit of surface area compared with current materials. In catalysis, a fundamental step of photosynthesis, the protein known as photosystem II splits water into hydrogen and oxygen; new research focuses on creating artificial water-splitting catalysts (presently aimed at manganese, nickel, cobalt, and doped iron oxide) that can be regenerated over an extended time frame from materials that are inexpensive and easily obtained. The final segment, carbon dioxide reduction, is a significant effort by researchers to re-create photosynthesis’s ability to reduce atmospheric carbon dioxide; it is also one of the most important, considering the current levels of carbon dioxide in the atmosphere (Faunce, 2013).
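
For orientation, the overall reactions that these three segments aim to reproduce are textbook chemistry (shown here for reference; the equations are not drawn from Faunce):

\[
2\,\mathrm{H_2O} \;\longrightarrow\; 2\,\mathrm{H_2} + \mathrm{O_2} \qquad \text{(water splitting)}
\]
\[
6\,\mathrm{CO_2} + 6\,\mathrm{H_2O} \;\xrightarrow{\ \text{light}\ }\; \mathrm{C_6H_{12}O_6} + 6\,\mathrm{O_2} \qquad \text{(photosynthesis)}
\]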

Given the continuing advancements mentioned here and those yet to be attained, a degree of caution must be maintained. While the hope is that nanotechnology and the nanosciences are on a path toward a safer, cleaner, and more viable future, it is realistic to expect that not all of these technologies will be used for the betterment of society as a whole. Power and greed are realities that, regardless of the era, drive those who would corrupt rather than embrace the positive possibilities these sciences can achieve. Worldwide regulations and standards will need to be applied and upheld; whether they are followed will always be in question. It will remain, as always, up to those who hold the future of this world in the utmost importance to ensure that future endeavors are undertaken with thoughtful foresight.

References

Cooney, C. (2010). Triclosan comes under scrutiny. Environmental Health Perspectives 118(6), A242. Retrieved from http://search.proquest.com.libproxy.edmc.edu/docview/499793054

Faunce, T. (2013). Powering the world with artificial photosynthesis. The Futurist, 47(3), 6-8. Retrieved from http://search.proquest.com.libproxy.edmc.edu/docview/1419400518

Feynman, R. (1992). There’s plenty of room at the bottom. Journal of Microelectromechanical Systems, 1(1), 60-66. Reprinted from Miniaturization, Horace D. Gilbert, Ed. Retrieved from http://media.wiley.com/product_data/excerpt/53/07803108/0780310853.pdf

Smallwood, C. (2009). 50 years later, still plenty of room at the bottom. Quest: The Science of Sustainability. Retrieved from http://science.kqed.org/quest/2009/11/02/50-years-later-still-plenty-of-room-at-the-bottom/

Soldatenko, A. (2011). Current uses of nanotechnology. University of Strasbourg. Retrieved from http://www.unitar.org/cwm/sites/unitar.org.cwm/files/Lodz%20Presentation%20(Current%20uses%20%20of%20nanotechnology).pdf

United States National Nanotechnology Initiative. (n.d.). What is nanotechnology? Nanotechnology 101.  Retrieved from http://www.nano.gov/nanotech-101/what/definition

Wong, V., Gurtner, G., & Longaker, M. (2013). Wound healing: A paradigm for regeneration. Mayo Clinic Proceedings, 88(9), 1022-31. Retrieved from https://login.libproxy.edmc.edu/login?url=http://search.proquest.com.libproxy.edmc.edu/docview/1436263946
