Posted on 01/02/2011 7:47:39 AM PST by Libloather
Okay, I can see how that would work. One camera for flight info and the other eight would send 64 images - eight each. That makes sense.
"Straps," see how that would have to be written for it to make mathematical sense?
Yes, I totally agree, but once again look who wrote the article: two lefties, Ellen Nakashima and Craig Whitlock. You really don't expect the Lame Street Media to ever do a bit of research, do you? I was trying to show that it is possible to have 65 pictures sent through a sub-mux to the ground station and then demuxed to make all pictures available for viewing.
Page 2. It also seems that the person who did the cut and paste left out a lot of the article:
The system, made up of nine video cameras mounted on a remotely piloted aircraft, can transmit live images to soldiers on the ground or to analysts tracking enemy movements. It can send up to 65 different images to different users; by contrast, Air Force drones today shoot video from a single camera over a “soda straw” area the size of a building or two.
With the new tool, analysts will no longer have to guess where to point the camera, said Maj. Gen. James O. Poss, the Air Force’s assistant deputy chief of staff for intelligence, surveillance and reconnaissance. “Gorgon Stare will be looking at a whole city, so there will be no way for the adversary to know what we’re looking at, and we can see everything.”
Questions persist, however, about whether the military has the capability to sift through huge quantities of imagery quickly enough to convey useful data to troops in the field.
Officials also acknowledge that Gorgon Stare is of limited value unless they can match it with improved human intelligence - eyewitness reports of who is doing what on the ground.
The Air Force is exponentially increasing surveillance across Afghanistan. The monthly number of unmanned and manned aircraft surveillance sorties has more than doubled since last January, and quadrupled since the beginning of 2009.
Indeed, officials say, they cannot keep pace with the demand.
The eyes in the sky never blink.
Me too.
But I've always enjoyed puzzles, and a short reflection makes me note the absence of the word "simultaneously." So obviously some (or all) cameras are multipurpose on command. The permutations can easily accommodate 65 "different" sets of data.
The one intriguing statement is the flexibility of the word "everything." We are not likely to learn details and specifics for years.
One can only hope that this new tool can moderate the risk to American troops struggling under criminal rules of engagement.
There’s an office in the Pentagon that hands out these names. They pick one word from column A and one from column B, hence “Have Blue”, etc.
Each camera has a compound lens of eight segments. Does the user choose a combination of segments that covers the area of interest?
63 + 2 = 65
Two have single feeds.
imo
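None of these breakdowns is confirmed anywhere in the article; the camera and feed counts below are just the guesses floated in this thread. A quick sketch checking that the speculated arithmetic actually lands on 65:

```python
# Hypothetical breakdowns of how a nine-camera pod might yield 65 feeds.
# These figures are thread speculation, not confirmed specs; the code
# only verifies the arithmetic.

# Guess 1: one camera reserved for flight info, the other eight each
# multiplexed into eight sub-images, plus the single flight feed.
guess1 = 8 * 8 + 1

# Guess 2: 63 multiplexed imagery feeds plus two cameras with single feeds.
guess2 = 63 + 2

print(guess1, guess2)  # both come out to 65
```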
From the article:
“Gorgon Stare is being tested now, and officials hope it will be fielded within two months. Each $17.5 million pod weighs 1,100 pounds and, because of its configuration, will not be mounted with weapons on Reaper aircraft, officials said. They envision it will have civilian applications, including securing borders and aiding in natural disasters. The Department of Homeland Security is exploring the technology’s potential, an industry official said.”
Scary technology, necessary in war, but unfortunately coming soon to the skies over your town. Guess we’ll all have to start wearing stealth burkas.
In the War Games movie sequel, Professor Falken shielded his car roof so they wouldn't be detected via heat signatures.
Thanks folks!
I thought there was some clever inside joke, like Have Blue = Got Sky.
Not sure why the WaPo reported this now. This is only the beginning. From a Sep 2009 article:
...The $15 million Gorgon Stare, as the Air Force has labeled the new sensor, initially will be slung underneath the MQ-9 Reaper UAV. Other vehicles such as the RQ-4 Global Hawk UAV and even manned aircraft could be fitted later.
It's intended to supplement the multispectral targeting sensor that the Reaper carries now to transmit full-motion video of target areas. Gorgon Stare, which operates in the day and at night, has a slower refresh rate, but it will allow users to pick targets of interest that the full-motion video sensor can then focus on to get a more complete idea of their nature.
The future DARPA system, called the Autonomous Real-time Ground Ubiquitous Surveillance Imaging System (ARGUS-IS), takes the concept a step further. It uses a 1.8-gigapixel camera running at 15 frames per second to provide 27 gigapixels of video per second.
That's one of three ARGUS-IS subsystems. The other two are an airborne processor that can handle more than 10 teraops (a teraop is 10^12 operations per second) and a ground processing subsystem that records and displays the information sent to it by the airborne processor.
The 65 independent video feeds are the first application that will be embedded into the airborne processor. DARPA has already planned for a second application that will provide a real-time moving target indicator, enabling users to track vehicles throughout the sensor's entire field of view.
The ARGUS-IS will first be integrated into the A160 Hummingbird unmanned helicopter for flight testing and demonstrations, DARPA officials said...
http://defensesystems.com/articles/2009/09/02/c4isr-3-gorgon-stare.aspx?sc_lang=en
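The throughput figures in the article are internally consistent: 1.8 gigapixels per frame at 15 frames per second works out to the quoted 27 gigapixels per second. A minimal sanity check, with a data-rate estimate at the end that assumes a purely illustrative 1 byte per pixel (not a real sensor spec):

```python
# Sanity-check the ARGUS-IS throughput figures quoted in the article.
gigapixels_per_frame = 1.8   # from the article
frames_per_second = 15       # from the article

gigapixels_per_second = gigapixels_per_frame * frames_per_second
print(gigapixels_per_second)  # matches the quoted 27 Gpx/s

# Assuming a hypothetical 1 byte per pixel, raw output would be on the
# order of 27 GB/s, which is why on-board processing and windowed
# downlinks (rather than shipping the full frame) matter.
raw_gb_per_second = gigapixels_per_second * 1
```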
Autonomous Real-time Ground Ubiquitous Surveillance - Imaging System (ARGUS-IS)
The technical emphasis of the program is on the development of the three subsystems (a 1.8-gigapixel video sensor, an airborne processing subsystem, and a ground processing subsystem) that will be integrated to form ARGUS-IS. The 1.8-gigapixel video sensor produces more than 27 gigapixels per second running at a frame rate of 15 Hz. The airborne processing subsystem is modular and scalable, providing more than 10 TeraOPS of processing. The gigapixel sensor subsystem and airborne processing subsystem will be integrated into the A-160 Hummingbird, an unmanned air vehicle, for flight testing and demonstrations. The ground processing subsystem will record and display information downlinked from the airborne subsystem. The first application that will be embedded into the airborne processing subsystem is a video window capability. In this application, users on the ground will be able to select a minimum of 65 independent video windows throughout the field of view. The video windows, running at the sensor frame rate, will be downlinked to the ground in real time. Video windows can be utilized to automatically track multiple targets as well as providing improved situational awareness. A second application is to provide a real-time moving target indicator for vehicles throughout the entire field of view in real time.
http://www.darpa.mil/i2o/programs/argus/argus_approach.asp
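The "video window" idea above amounts to cropping independent sub-rectangles out of one enormous frame and downlinking only those crops. A toy sketch of that concept, using a tiny 8x8 frame of made-up pixel values in place of the real 1.8-gigapixel image (frame size and window coordinates here are illustrative, not actual sensor geometry):

```python
# Toy sketch of DARPA's "video window" concept: independent users each
# request a sub-window of one large frame, and only those crops are sent.

def crop(frame, top, left, height, width):
    """Return a rectangular window of a frame stored as a list of rows."""
    return [row[left:left + width] for row in frame[top:top + height]]

# A tiny 8x8 "frame" standing in for the 1.8-gigapixel image;
# pixel (r, c) holds the value r*8 + c so crops are easy to verify.
frame = [[r * 8 + c for c in range(8)] for r in range(8)]

# Two independent windows requested by different users.
window_a = crop(frame, 0, 0, 2, 2)  # top-left 2x2
window_b = crop(frame, 6, 6, 2, 2)  # bottom-right 2x2

print(window_a)  # [[0, 1], [8, 9]]
print(window_b)  # [[54, 55], [62, 63]]
```

Each window is fully independent of the others, which is what lets 65 (or more) users look at different parts of the same field of view without the sensor repointing.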
For those interested, the Insight Program is also very cool...
(Click the Insight PDFs on the below link for more info)
http://www.darpa.mil/i2o/solicit/solicit_open.asp