![]()
Orange County Register, 8-7-2024
The history of my Yellow Line invention begins with my education at Don Bosco Technical Institute, where I took two years of physics and four years of print shop graphics. At USC I majored in physics, which included optics and a great deal of analytic geometry. After graduating from USC in 1968, I joined the Naval Ocean Systems Command (NOSC) to begin my career in science and engineering. On my second day on the job, I was shown a red-screen light table simulating antisubmarine warfare, where the positions of surface craft, aircraft, submarines, and antisubmarine weapons were all driven by sonar and radar data. A couple of months later, one of the senior scientists asked if I could help debug one of the simulation's Fortran subroutines, one that corrected sonar data for the earth's curvature out to 80 miles. That same year, my boss asked me if I could figure out how to design, in principle, something called a "digital mixer". This turned out to be a simple Boolean algebra converter that combines two video signals for display on a two-dimensional screen.
Finally, at NOSC, I was introduced to the Cable-controlled Undersea Recovery Vehicle (CURV), whose undersea TV cameras were made famous by the 1966 recovery of a hydrogen bomb that fell into the Mediterranean in a B-52 air-refueling accident. Five years later, at my PhD qualifying exam, one of my advisors asked how I might go about designing a 4-quadrant star-tracking photosensor for telescopes, like those in the high Andes that track light emitted billions of years ago.
![]()
Texas Tech & Tennessee players fight for the
I started visualizing how a video camera could generate a line drawn across the field on a two-dimensional TV screen. It occurred to me that the camera would need to sit on a tripod instrumented to measure the angles describing the direction the camera is pointed with respect to the field, namely pan, tilt, and zoom (distance to the field). This would easily work for a camera looking down the 50-yard line and drawing a line anywhere near the white sideline on the field. But how does a first down line drawn on the 20-yard line appear to a camera mounted at the 50-yard line, in what would be a highly oblique view? I wrestled with this in my mind's eye for a few days until, driving home from the Coliseum, I had my Isaac Newton moment, like the apple falling from the tree that led Newton to the law of gravity. I realized you need to use 3D spherical coordinates to depict a two-dimensional object in an oblique view.
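To make the geometry concrete, here is a minimal sketch, in Python, of the kind of calculation an instrumented camera makes possible: given the camera's position and aim (the quantities the pan/tilt/zoom sensors supply), project the 3D endpoints of a yard line into 2D screen coordinates with a simple pinhole-camera model. All numbers, names, and the look-at formulation are my own illustration, not the patent's actual equations.

```python
import numpy as np

def look_at(cam_pos, target, up=(0.0, 0.0, 1.0)):
    """World-to-camera rotation for a camera at cam_pos aimed at target.
    Rows are the camera's right, up, and forward axes in world coordinates."""
    f = np.asarray(target, float) - cam_pos
    f = f / np.linalg.norm(f)                    # forward (view direction)
    r = np.cross(f, up)
    r = r / np.linalg.norm(r)                    # right
    u = np.cross(r, f)                           # true up
    return np.stack([r, u, f])

def project(points, cam_pos, R, focal):
    """Pinhole projection: 3-D field points (yards) -> 2-D screen coordinates."""
    screen = []
    for p in points:
        x, y, z = R @ (np.asarray(p, float) - cam_pos)   # into camera frame
        screen.append((focal * x / z, focal * y / z))    # divide by depth
    return screen

# Field frame: x runs 0-100 along the yard lines, y spans the 53.3-yard width,
# z is height.  Camera high above and behind the 50-yard line, aimed at the 20.
cam = np.array([50.0, -30.0, 20.0])
R = look_at(cam, target=(20.0, 26.65, 0.0))
line_3d = [(20.0, 0.0, 0.0), (20.0, 53.3, 0.0)]          # the 20-yard line
print(project(line_3d, cam, R, focal=1000.0))
# The endpoints land at different screen heights: the oblique view turns a
# straight cross-field line into a slanted segment, which is why flat 2-D
# drawing fails and the full 3-D geometry is needed.
```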
About three weeks later I had a sketchy system diagram and a written description showing how a first down line could be drawn on a live television screen. Two pieces of equipment are needed: a fully instrumented camera (later named the "Yellow Line Camera" in commercial use of my invention) and a computer to calculate the analytic geometry needed to merge geometric objects with live video images (the computer was later called the "Linear Key").
I submitted my invention for expert review by two of my colleagues at the Naval Ocean Systems Command. One was an expert in torpedo guidance and control (G&C) systems and the other was responsible for the CURV program. With my colleagues' approval, I submitted my TV Object Locator application to the US Navy patent attorneys. Within a couple of weeks, and with a degree of excitement, the attorneys got back to me saying my invention was one of the best ideas they had seen.
In forming my patent application, the attorneys made two important contributions. (1) They selected the polar three-dimensional coordinate system, the one used in astronomy, cartography, and "down-range" tracking systems like sonar and radar. (The other spherical convention is the one used in particle physics.) The attorneys then worked out the complex analytic geometry equations and published them in the patent document. (2) Their second contribution was changing the name of my invention from "TV Object Locator" to "TV Object Locator and Image Identifier", a name that is easily understood in commercial terms. The application was submitted in July 1976.
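For readers curious about the distinction between the two conventions, here is a small illustrative sketch (my own, not from the patent): the down-range convention measures elevation up from the horizontal plane, while the particle-physics convention measures a polar angle down from the vertical axis. Both name the same point in space.

```python
import numpy as np

def from_downrange(r, azimuth, elevation):
    """Astronomy/cartography/sonar convention: azimuth in the horizontal
    plane, elevation measured up from it."""
    x = r * np.cos(elevation) * np.cos(azimuth)
    y = r * np.cos(elevation) * np.sin(azimuth)
    z = r * np.sin(elevation)
    return x, y, z

def from_physics(r, theta, phi):
    """Physics convention: theta is the polar angle measured down from the
    vertical axis, phi the azimuth; elevation = 90 degrees - theta."""
    return from_downrange(r, phi, np.pi / 2 - theta)

# The same target -- 1000 yards out, bearing 30 degrees, 10 degrees above
# the horizon -- expressed in both conventions gives identical coordinates:
print(from_downrange(1000.0, np.radians(30), np.radians(10)))
print(from_physics(1000.0, np.radians(80), np.radians(30)))
```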
For nearly two years I worried about my patent application, thinking that my idea was so simple and obvious that some other inventor would beat me to patent approval. But by early 1978 I was notified that my patent would be granted pending some legally required edits.

![]()
The patent document consists of pages of technical information, drawings, and claims. Most clarifying is the background statement found on page 4 and reprinted here.
Two months of on-and-off discussions with ABC's producers led to the conclusion that my invention wasn't ready for NFL prime time: "To superimpose graphics over a live play is taboo at ABC and probably in the broadcast industry. Have you ever been in an NFL sound truck during a live telecast? It is sheer chaos in there and we don't need some computer brought in to add confusion", and "Too much cost, too much equipment, too many cameras, for something that may be only used 4 times a game". Although I got brushed off, I did learn from my discussions with ABC that my invention would be worth about $10,000 per game in advertiser royalties, an estimate that proved accurate 20 years later.
Others expressed interest, including the CBS Technology Center, which responded, after due consideration, that it would continue funding its own "Action Track" technology, slated for debut in the following January's Super Bowl telecast.
"Action Track" did debut, unannounced, for 1 play (an extra
point kick) and crudely followed the ball in flight. But, the CBS announcers were unprepared and
confused by what happened on the screen, killing prospects for "Action
Track" along with my "TV Object Locator and Image Identifier"
for nearly 15 years.
I lost touch with the technological developments until 1996, when I learned that a company named Princeton Video Image was working on an invention that would cover the outfield walls of a baseball stadium with billboards that changed dynamically over the course of a game. The invention worked except when outfielders made plays in front of the billboards, which smeared the players' outlines across the billboard messages.
The same year, another company, named Sportvision, developed and introduced to television a technology that followed a hockey puck during a live telecast. "Fox Trax" (also called the "Glow Puck") failed to gain fan acceptance because it produced a red smear with an indistinguishable arrowhead at its tip. But the "Glow Puck" had one redeeming quality: it called for insertion of an RF beacon into the hockey puck, a feature claimed in my original patent. Later, this came to be used in all manner of sporting events, with beacons inserted into footballs, baseballs, golf balls, race cars and, finally, players and athletes, usually to measure speed and distance.
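As a simple illustration of the speed-and-distance idea (the data format and numbers are assumed, not from any actual tracking system), speed falls straight out of successive timestamped beacon fixes:

```python
import numpy as np

# Assumed data format: each beacon fix is (t_seconds, x_yards, y_yards)
# in field coordinates.
def speeds(fixes):
    """Speed between successive fixes: distance moved over time elapsed."""
    f = np.asarray(fixes, dtype=float)
    dist = np.linalg.norm(np.diff(f[:, 1:], axis=0), axis=1)  # yards moved
    dt = np.diff(f[:, 0])                                     # seconds elapsed
    return dist / dt                                          # yards per second

print(speeds([(0.0, 10, 5), (0.5, 14, 5), (1.0, 19, 6)]))  # [8.0, ~10.2]
```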
Undaunted, Sportvision introduced "First & 10", a commercial version of my invention that was limited to drawing first down lines, which smeared when the line crossed over players or uneven imperfections in the grass field itself.
![]()
The "First & 10" camera was limited to instances where the first down line was centered near the camera's field of view and away from players and officials.

![]()
The "First & 10" camera was found useless in oblique views of the first down line inside the 30-yard line.
"Princeton Billboard," "Fox Trax," and "First & 10" failed because the inventions were limited to two-dimensional image processing in a three-dimensional world.
From the beginning, my 1976 invention was inspired by various ideas I developed while working with the Naval Ocean Systems Command. In short, the "TV Object Locator and Image Identifier" was conceived as a depiction of three-dimensional objects on two-dimensional screens: TV, sonar, or radar. The three-dimensional image is calibrated by well-established downrange targeting instrumentation like sonar and radar. The third dimension can consist of several layers that separately contain lines, images, distance to the target, or text above or below the object of interest, the prime examples being the football field and the baseball outfield billboards.
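Here is a minimal sketch of the layering idea, with invented shapes and names of my own: each layer is drawn in depth order, so a graphic placed on the field layer is automatically covered by anything on the players layer above it.

```python
import numpy as np

def composite(layers):
    """Alpha-composite a bottom-to-top list of HxWx4 RGBA images."""
    out = np.zeros(layers[0].shape[:2] + (3,))
    for layer in layers:
        rgb, a = layer[..., :3], layer[..., 3:4]
        out = rgb * a + out * (1.0 - a)   # each layer covers what lies below
    return out

H, W = 4, 4
field = np.zeros((H, W, 4)); field[..., 1] = 0.5; field[..., 3] = 1.0    # green turf
line = np.zeros((H, W, 4))
line[:, 2, :3] = (1.0, 1.0, 0.0); line[:, 2, 3] = 1.0                    # yellow stripe
players = np.zeros((H, W, 4))
players[1, 2, :3] = 1.0; players[1, 2, 3] = 1.0                          # one "player" pixel

frame = composite([field, line, players])
print(frame[1, 2], frame[2, 2])  # player pixel stays white; line shows elsewhere
```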
By 1997, Princeton and Sportvision had filed new patents that incorporated my 1978 patent by reference. On September 27, 1998, Sportvision debuted its Yellow Line at a Cincinnati Bengals-Baltimore Ravens game. A few weeks later, on Thanksgiving Day, Princeton Video Image, having set aside its "Billboard" aspirations, aired its "Yellow Down Line" to compete with Sportvision. Both companies went on to claim success and set precedents for further inventive ideas inspired by my invention.
Many fans wonder how the Yellow Line works. Here, then, is a description of how the invention works as seen in an NFL sound truck.
![]()
Note my "TV Image" marketing brochure in the foreground.
The yellow line occupies the monitor panel in the foreground. Three or more screens show a 100-yard football field template that is mathematically generated in 5-yard increments. The Yellow Line template sits on a layer below the actual football field and the players on it. (The "First & 10" template sat directly on top of the football field and the players.) The Yellow Line itself is calibrated by the "Yellow Line Spotter", a broadcast technician whose full-time job is to maintain alignment of the Yellow Line on the otherwise invisible template throughout the game. This alignment process begins a day or two before the game, is updated an hour before kickoff, and continues throughout the game itself. Many factors influence the shape of the mathematical field model, including irregularities like the drainage crown in the center of a natural grass field (often two to three feet high) and the concrete stadium itself, which can sink as much as five feet when fully occupied. Thus, the Yellow Line TV camera's legs need to be sunk in deep concrete.
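A simplified sketch of the layering rule follows (my own illustration; the colors, tolerance, and function names are assumptions): the template says where the line lies, and the line is painted only on pixels that match the calibrated grass color, so players and officials standing on the line keep their own pixels and appear to stand on top of it.

```python
import numpy as np

GRASS = np.array([60.0, 140.0, 60.0])   # assumed calibrated field color (RGB)
TOLERANCE = 45.0                         # assumed color-match tolerance

def key_line(frame, line_mask, line_color=(255, 220, 0)):
    """Paint line_color only where the template mask says the line lies AND
    the live pixel is close to the calibrated grass color."""
    dist = np.linalg.norm(frame.astype(float) - GRASS, axis=-1)
    is_grass = dist < TOLERANCE              # True where the pixel is turf
    out = frame.copy()
    out[line_mask & is_grass] = line_color   # players/officials keep their pixels
    return out

frame = np.full((4, 4, 3), GRASS, dtype=np.uint8)       # all turf
frame[1, 2] = (255, 255, 255)                           # a player's white jersey
mask = np.zeros((4, 4), dtype=bool); mask[:, 2] = True  # template: line in column 2
print(key_line(frame, mask)[1, 2])                      # still white: player occludes line
```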
The success of my "TV Object Locator and Image Identifier" immediately began to inspire other inventions in the sporting world and beyond: technologies like "Green Screen", "PITCH f/x", "Race Car", "Golf F/x", and the "CBS Telestrator", a screen drawing system that John Madden couldn't handle. "Chroma Key & Compositing" are utility apps that do all manner of screen editing, like feathering "Green Screen" edge fringe.
Yellow Line apps make the integration of digital information (like labels and graphical elements) with the user's environment a new phenomenon called "Augmented Reality". Highly illustrative are AR stargazing apps such as Night Sky on Apple devices. Here's how it works. The geographical coordinates of the astronomer's cellular phone are known. When the phone is pointed toward the heavens, it compares the star-filled image to a built-in map (or template) of the night sky as seen from the specific latitude and longitude of the phone's location. The phone pans and zooms its map to fit the stars, constellations, galaxies, planets, comets, and meteors. It identifies everything (visible and invisible) and draws links illustrating the constellations as well.
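Here is a minimal sketch of the pointing math such an app must solve, using the standard astronomy formulas (the example star and numbers are merely illustrative): converting a star's catalog position to an altitude and azimuth in the observer's local sky.

```python
import numpy as np

def alt_az(ra, dec, lat, lst):
    """Altitude and azimuth of a star in the local sky (angles in radians).
    ra/dec: the star's catalog position; lat: observer latitude;
    lst: local sidereal time expressed as an angle."""
    ha = lst - ra                                   # hour angle
    alt = np.arcsin(np.sin(dec) * np.sin(lat)
                    + np.cos(dec) * np.cos(lat) * np.cos(ha))
    az = np.arctan2(-np.cos(dec) * np.sin(ha),
                    np.sin(dec) * np.cos(lat)
                    - np.cos(dec) * np.sin(lat) * np.cos(ha))
    return alt, az % (2 * np.pi)                    # azimuth measured from north

# Illustrative only: Vega (RA ~18.62 h, Dec ~+38.8 deg) from latitude 34 deg N
# at a moment when the local sidereal time is 20 h (1 hour of RA = 15 degrees).
rad = np.radians
alt, az = alt_az(rad(18.62 * 15), rad(38.8), rad(34.0), rad(20 * 15))
print(np.degrees(alt), np.degrees(az))              # degrees above horizon, bearing
```

With the phone's latitude and longitude fixing `lat` and the clock fixing `lst`, the app runs this computation for every catalog object and then matches the result against the camera image, panning and zooming the map until the two agree.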
Today there is an explosion in AR apps. Soon, 3D facial-mapping systems that identify the person across from you will be small enough to mount in eyeglass frames.
![]()
I am reminded of my PhD qualifying exam, some 50 years ago, where my three advisors challenged me to describe how a microelectronic star tracker might work.