Physics

Cameras

Cameras are devices that capture and record images by focusing light onto a photosensitive surface. They operate based on the principles of optics, utilizing lenses to form an image of the scene being photographed. In physics, cameras are studied in relation to the behavior of light, image formation, and the properties of lenses and mirrors.

Written by Perlego with AI-assistance

7 Key excerpts on "Cameras"

Index pages curate the most relevant extracts from our library of academic textbooks. They’ve been created using an in-house natural language model (NLM), each adding context and meaning to key research topics.
  • Police Photography
    eBook - ePub
    • Larry Miller, Norman Marin (Authors)
    • 2014 (Publication Date)
    • Routledge
      (Publisher)
    CHAPTER 4 Light Theory and Digital Imaging

    Contents

    Introduction
    Refraction
    Light
    Light—Our Most Important Tool
    Intensity of Light
    Digital Image Formation
    The Digital Sensor
    Image Resolution
    Sensor Size and Optics
    Color Reproduction
    Image File Formats
    Use and Selection of File Formats
    Reference
    ABSTRACT
    Light is the photographer's most important tool. Whether using traditional film or digital imaging cameras, the proper use of light will ensure proper exposure. Digital cameras are capable of recording light images over a wide range of the electromagnetic spectrum, from ultraviolet through infrared. Light is manipulated through the camera lens and recorded on film or a digital imaging sensor by the time and amount of light striking the light-sensitive material.
    Key Terms
    • Concave
    • Convex
    • Focal length
    • Focus
    • Infinity
    • Infrared light
    • Inverse square law
    • Lens
    • Luminous light
    • Reflection
    • Refraction
    • Shutter
    • Ultraviolet light
    • Visible light

    Introduction

    A photograph is the result of a manipulation, a recording—and in some cases, creation—of light. The entire science of photography is concerned with light manipulation and recording. The camera and its parts (shutter, diaphragm, focus) filter light, exposure meters record light, and flash units and lamps create light.
    Light is radiation. When an atom in a light source is changed physically (the cause of the physical change is irrelevant for our purposes), it emits a photon (electromagnetic radiation) that behaves like a wave and, at the same time, like a particle. For the purposes of photography, light may be discussed as photons that behave like waves.
    Light can be compared to a ripple over a pond. If a stone is thrown into a still pond, a series of concentric ripples can be seen that radiate from the source. The concentric ripples are formed because the energy from the disturbance is radiating in straight lines and in all directions over the surface of the pond. A cross section of the ripples in the pond would show that the energy moving across the pond has formed waves that are characterized by their peaks and troughs.
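    Though the excerpt stays with the wave picture qualitatively, the numbers attach easily: for any wave, speed equals wavelength times frequency, and for light the speed is c. A minimal sketch in Python (the 550 nm green wavelength is an illustrative assumption, not a value from the text):

```python
# Minimal sketch: wave speed = wavelength x frequency; for light, the speed is c.
C = 299_792_458.0            # speed of light in a vacuum, m/s

wavelength = 550e-9          # 550 nm, a green wavelength (illustrative value)
frequency = C / wavelength   # cycles per second (Hz)

print(f"{frequency:.3e} Hz")   # roughly 5.45e14 Hz for green light
```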
  • Applied Photographic Optics
    • Sidney Ray (Author)
    • 2002 (Publication Date)
    • Routledge
      (Publisher)
    2 The role of the lens in photography

    2.1 Imaging

    The lens-based media of photography, cinematography, video and digital imaging in all their aspects make use of various radiation-sensitive systems in general and light-sensitive emulsions, surfaces, materials or photodetector arrays in particular.
    The lens of a camera (including a motion picture camera, video camera, or electronic and digital imaging systems) is the ‘front end’ of the system in that it acts as an interface between the scene or subject and the viewing or recording system. The prime function of the lens is to form a sharp optical image on the surface of the recording medium or display system. The image must have sufficient detail or resolution for the situation, and an adequate illuminance to permit a convenient exposure duration by the shutter or scanning system used. Some recording systems may dispense with imaging lenses as such, e.g. those used for holography, or assemble an image composed of pixels (picture elements) by sequential scanning of the subject or optical image as in thermography and video respectively, but associated optical components are usually needed.
    Viewfinder systems permit an image to be seen in real time for subject location, focusing and composition.
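    The excerpt does not spell out the geometry behind "forming a sharp optical image", but the usual first approximation is the thin-lens relation 1/f = 1/u + 1/v, where u is the subject distance and v the lens-to-image distance. A minimal sketch under that assumption (the 50 mm focal length and 2 m subject distance are illustrative values, not from the text):

```python
# Minimal sketch of the thin-lens relation 1/f = 1/u + 1/v, solved for the
# image distance v. All distances are in metres; values are illustrative.

def image_distance(focal_length: float, subject_distance: float) -> float:
    return 1.0 / (1.0 / focal_length - 1.0 / subject_distance)

f = 0.050   # 50 mm lens
u = 2.0     # subject 2 m away
v = image_distance(f, u)
print(f"image forms {v * 1000:.2f} mm behind the lens")   # ~51.28 mm
```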

    2.2 Recording

    There are many ways of recording a subject, from a pencil drawing to storage in various digital formats on tape or disc. Of these, photography using a lens and film can still be one of the most cost-effective and convenient, and is potentially capable of having a very high information content.
    A highly corrected lens can give an image of the subject full of detail down to the natural limits set by the physical nature of light. A photographic record can be obtained from a brief exposure to this image followed by development, and may then be examined and evaluated at leisure. No other system seems to offer such a combination of convenience, sensitivity, unbiased recording, wealth of detail and image permanence for a factual representation of a complex scene.
  • Basics of Video Lighting
    • Des Lyver, Graham Swainson (Authors)
    • 2013 (Publication Date)
    • Routledge
      (Publisher)

    3 The video camera

    Video cameras are simply devices which can convert the images we see into electrical images that can be recorded onto magnetic tape. The two main ways of doing this are either by using cathode ray tubes (tube cameras) or charge-coupled devices (CCD cameras). Tube cameras are now almost exclusively redundant, although some countries will still have the old technology (Figure 3.1). The advancing microelectronics industry has given us lighter, smaller, better, more reliable and more sensitive imaging technology. As with computer technology, these CCD devices are known as chips; hence three-chip cameras.
    Figure 3.1 Three-tube camera shown for comparison
    All cameras have a lens, which helps to control and shape the image achieved. The lens will have at least two controls, focus and aperture (or iris).
    The focus allows for change of distance from the subject to the camera, to keep the significant part of the image clear. The aperture controls the amount of light getting into the camera and so is crucially connected with the effect of lighting. A variable-size hole (called the iris) allows more or less light in. The numbers (f-numbers) on it are in a scale such that each step halves (or doubles) the amount of light let in. Perhaps confusingly, the numbers of this scale get smaller the bigger the hole is, so, for instance, f4 lets in twice as much light as f5.6.
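    As a rough illustration of that f-number scale, the sketch below assumes the usual approximation that the light admitted is proportional to the inverse square of the f-number (lens transmission losses ignored), which is why each full stop halves the light:

```python
# Minimal sketch: light admitted is roughly proportional to 1 / f-number squared,
# so each full stop (f2.8, f4, f5.6, f8, ...) halves the amount let in.

def relative_light(f_number: float) -> float:
    """Light transmitted relative to f/1 (lens losses ignored)."""
    return 1.0 / f_number ** 2

for n in (2.8, 4.0, 5.6, 8.0):
    print(f"f/{n}: {relative_light(n):.4f} of the light at f/1")

# Adjacent stops differ by roughly a factor of two, e.g. f4 versus f5.6:
print(relative_light(4.0) / relative_light(5.6))   # ~1.96, about double
```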
    Immediately behind the lens the image is broken into three separate colours – called primary colours – red, green and blue. The amounts of these are measured. The right amounts of red, green and blue will result in white. If the percentage of these colours is altered, the result will not be interpreted as white.
  • Understanding Forensic Digital Imaging
    • Herbert L. Blitzer, Karen Stein-Ferguson, Jeffrey Huang (Authors)
    • 2010 (Publication Date)
    • Academic Press
      (Publisher)
    CHAPTER 3 Light and Lenses

    BASICS OF LIGHT

    The English word photography is most commonly understood to mean, "writing with light," as derived from the Greek φῶς, "phos," the root of the word for light, and γράφω, "grapho," the base of the verb "to write." So it is logical that to better understand the processes involved, the nature of light should be examined, as well as the device that allows it to enter the camera. After all, it is the light that is doing the writing.
    A good start to learning about light is to consider how it is generated and what it is made of. The place to start is the simple Bohr model of the atom, which is made up of a nucleus surrounded by a group of moving electrons. In this simple model, the electrons are in circular orbits around the nucleus similar to the way in which the planets circle the sun (see Figure 3.1 ). Newer models of the atom are more complex, but the basics of light generation are similar enough so that we can use the simple model. In the atom, the orbits have distinct energy levels. If energy is applied to an atom at the right level, it can cause an electron to move from a near orbit to a more distant orbit. But this is not necessarily a stable condition, and the electron will eventually fall from the high energy orbit to one at lower energy, often the one from which it came in the first place. As the atom goes from a higher energy state to a lower one, it must give up some energy. This is done by emitting light. Each such transition produces a packet of energy in the form of an electromagnetic wave packet called a photon. The photon is considered a particle of light. When a charged particle (in this case, an electron) moves rapidly, it results in the creation of an electromagnetic wave. This is what is happening in the atom. Electromagnetic waves do not need a medium in which to propagate (as water waves and sound waves do). The photon will travel in a straight line until it is either sent off course by a physical obstacle or absorbed. Photons are absorbed by the reverse of the process by which they are created. They impart their energy to an electron and send that electron to a higher energy orbit. The speed of the propagation (in a vacuum) is a universal constant, the speed of light, c. In materials other than a vacuum, the speed of light will be lower, and the ratio of c divided by the speed in the particular medium is called the refractive index.
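    To put a number on the closing definition, a minimal sketch computes a refractive index as c divided by the speed of light in the medium (c is the standard constant; the in-water speed is an illustrative value, not from the text):

```python
# Minimal sketch: refractive index n = c / (speed of light in the medium).

C = 299_792_458.0   # speed of light in a vacuum, m/s

def refractive_index(speed_in_medium: float) -> float:
    return C / speed_in_medium

# Light travels at roughly 2.25e8 m/s in water:
print(refractive_index(2.25e8))   # ~1.33, the familiar index of water
```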
  • A Laboratory Manual in Biophotonics
    • Vadim Backman, Adam Wax, Hao F. Zhang (Authors)
    • 2018 (Publication Date)
    • CRC Press
      (Publisher)
    2   Optics Components and Electronic Equipment  
    The role optical instruments have played in the advent of the contemporary age is frequently underemphasized. While other modern marvels have taken the spotlight in recent years, the history of optics stakes a strong claim in the formation of the world we know. The telescope has been decisive in resolving the clash between geocentric and heliocentric conceptions, affirming the Copernican universe during Galileo’s time. The microscope broadened the horizons for biology and medicine, opening windows on an entire new world that was previously unobservable. The camera ushered forth a perspective on the world more vivid and more real than any painter’s brush could portray.
    Day to day, we find ourselves in continuous contact with optical instruments and their products. These optical instruments, along with their modern electronics, are the cornerstones to shape and observe the world we live in. To fathom the world of optics is to understand the fundamentals behind their operation. To explore the design and function of various sophisticated optical instruments is to comprehend the individual lenses, mirrors, and other optical devices that compose them.
    This chapter begins with an introduction to various commonplace optical devices at their simplest, namely lenses, mirrors, and objectives. While this section primarily relies on a diagrammatic description to suffice for illustrating fundamental physics concepts, several accompanying photographs of optics equipment are included for reference. In addition to the geometry and spatial organization of light, properties such as polarization and phase are as important as intensity and direction when describing optics. Introductions to waveplates and linear polarizers will offer descriptions of how such properties of light are controlled and adjusted.
    An important concept in optics deals with the transmission and efficient delivery of light. This topic primarily concerns optical fibers, their composition and fabrication, and their functions and importance in modern optics. The various mechanical equipment that supports the operation and calibration of sensitive optical arrays and apparatuses is termed "optomechanics" and is another topic covered in this chapter. These optomechanical devices come in various forms, including stages, mounts, and tables, each carefully manufactured to perform a specific function.
  • Langford's Starting Photography
    eBook - ePub

    The Guide to Creating Great Images

    • Philip Andrews (Author)
    • 2015 (Publication Date)
    • Routledge
      (Publisher)
    Unlike with film, there are no chemical steps involved in using your digital files to make prints. The camera is connected to your computer and all the digital photographs stored on your camera’s memory card are transferred into the memory of the computer. This process is called downloading. Once on the computer, the pictures can be displayed on screen, enhanced and edited using a software program called an image editor, such as Adobe Photoshop or Photoshop Elements. After all the picture changes have been made, the image is then printed using a desktop color printer or taken to a photo-laboratory for printing. Figure 2.5 shows the basic steps involved in producing a simple digital photograph.
    Figure 2.6
    The basic elements of a simple film and digital camera.

    The camera

    There are so many cameras you can buy that, to begin with, it is quite confusing. Remember though, every camera is basically just a light-tight box with a lens at one end and a light-sensitive surface (e.g. sensor or film) at the other. Film and digital cameras vary a great deal in detail, but they all possess the basic features shown in Figure 2.6 in some form. These are, first and foremost, a lens positioned the correct focusing distance from the film/sensor; a shutter; a lens aperture; a viewfinder; a means of moving to the next picture or advancing the film; and an indicator to show how many pictures you have taken.
    Figure 2.7
    Most modern cameras have automatic focusing systems built in, with some SLR models containing features that allow the user to switch between manual and auto-focus modes.
    The lens is the most important part of the whole camera. It must be protected from finger-marks and scratches, otherwise images resemble what you see when your eyes are watering. The spacing of the lens from the sensor/film has to change for subjects at different distances. The cheapest cameras have the lens 'focus free', meaning it is fixed for what the makers regard as the subject distance for average snaps. Some have a ring or lever with a scale of distances (or symbols for 'groups', 'portraits', etc.). Operating this focusing control moves the lens slightly further from the film the nearer your subject distance setting. Most modern cameras have lenses with an auto-focusing mechanism able to alter focusing to suit the distance of whatever the camera is pointing at in the central area of your picture (see Figure 2.7).
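    To put numbers on "the spacing of the lens has to change", a small sketch applies the same thin-lens relation used earlier (1/f = 1/u + 1/v); the 50 mm focal length and the subject distances are assumptions for illustration, not values from the text:

```python
# Minimal sketch: thin-lens relation 1/f = 1/u + 1/v solved for the
# lens-to-sensor distance v, in millimetres. Values are illustrative only.

def lens_to_sensor(focal_length_mm: float, subject_mm: float) -> float:
    return 1.0 / (1.0 / focal_length_mm - 1.0 / subject_mm)

f = 50.0                                         # 50 mm lens
for subject_mm in (10_000_000, 3_000, 1_000):    # effectively infinity, 3 m, 1 m
    v = lens_to_sensor(f, subject_mm)
    print(f"subject at {subject_mm / 1000:.0f} m -> lens {v:.2f} mm from sensor")
# The lens sits ~50.00 mm from the sensor for a distant subject and ~52.63 mm for one at 1 m.
```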
  • Single-Camera Video Production
    • Robert B. Musburger, PhD, Michael R Ogden (Authors)
    • 2014 (Publication Date)
    • Routledge
      (Publisher)
    From the perspective of a cinematographer, you must not get caught up in the hyperbole of the latest and greatest in video camera technology marketing. Remember, a camera is a tool of storytelling. So, whatever tool best helps people have an emotional reaction to the story you are telling is the right tool to use. In the past, you had a limited selection of tools; now, you have many more camera options at your disposal. As a single-camera producer, you need to be well versed in using multiple tools—consumer, prosumer, DSLR, professional broadcast or even digital cinema and special-purpose cameras—and then choose the one, or more than one, that helps you best tell your story.

    Image Sensors and Optics

    An image sensor is a device that converts an optical image into an electronic signal. In older video cameras, before the mid- to late 1980s, a vacuum camera tube or pickup tube was used to do the conversion. Several types of camera tubes were developed and used from the 1930s to the 1980s. Video camera tubes typically had a certain maximum brightness tolerance that, if exceeded, would result in burn-in or smear—among other known problems inherent to this technology. With the introduction of the camera "chip"—beginning with the charge-coupled device (CCD) and later the complementary metal–oxide–semiconductor (CMOS)—superior image sensors were finally available. Because the technology of the CCD and CMOS chips far surpassed that of camera tubes, camera manufacturers stopped making tube-based cameras around 1990.

    CCD and CMOS Chips

    While most camcorders on the market have traditionally used the CCD chip, more and more are being introduced with CMOS chips. While they ostensibly do the same thing—transduce light into an electronic signal—they go about it in different ways and have unique properties that make each suitable for different types of production situations. Traditionally, "passive pixel sensor" CCD chips have been thought to produce better-looking images with less visual noise and distortion than CMOS "active pixel sensor" chips (meaning each pixel has its own amplifier). In order to do this, CCDs also draw more power (up to 100 times more) and provide slower data-throughput speed (due to their passive pixel sensor design) than the CMOS image sensor. The benefits of CMOS sensors also extend to improvements in light-sensitivity technology (e.g., backside illumination) and noise-reduction techniques, allowing them to become more efficient in low-light situations while narrowing the noise gap with CCD sensors. Operationally, however, CCD image sensors still have advantages related to global shutter mechanics over CMOS cameras during video capture, and until the CMOS rolling shutter system can scan and read out the image as fast as a global shutter CCD, this difference will remain. Finally, CCDs are typically more expensive to manufacture than CMOS chips, causing many camera manufacturers to transition from CCD sensors to CMOS sensors in their consumer and prosumer lines. Interestingly (or confusingly!), large-format CMOS image sensors are also showing up in the very high-end digital cinema cameras because of their image-processing speeds and low-light sensitivity. Some aspects of how CCD and CMOS chips function (and their respective drawbacks) have been covered in Chapter 3.
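    The rolling versus global shutter distinction above is easy to visualise with a toy simulation. The sketch below (frame size, readout time and bar speed are made-up values, not from the text) reads out one row at a time the way a rolling shutter does, so a vertical bar moving sideways comes out slanted; a global shutter, sampling every row at the same instant, would keep it straight:

```python
# Toy model of rolling-shutter skew: each row is read out slightly later than
# the row above it, so an object moving sideways during readout appears slanted.
# All values here are made up for illustration.

ROWS, COLS = 8, 24
ROW_READOUT_TIME = 1.0   # time to read one row (arbitrary units)
BAR_SPEED = 1.5          # columns the bar moves per time unit
BAR_START = 2            # bar's column when readout of row 0 begins

frame = []
for row in range(ROWS):
    t = row * ROW_READOUT_TIME                 # this row is sampled later
    bar_col = int(BAR_START + BAR_SPEED * t)   # where the bar is at that moment
    frame.append("".join("#" if col == bar_col else "." for col in range(COLS)))

print("\n".join(frame))   # the '#' drifts rightward down the frame: skew
```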