The only way that I can see us maintaining individual liberty is to drastically reduce the number of laws and regulations now extant in the Nanny State. Otherwise, this sort of capability will make it possible for the powers that be to intimidate, harass, and punish anyone who opposes them. There are so many laws and regulations now available for potential abuse that no one can be sure they won't be prosecuted. Most people likely break the law, whether they intend to or not, several times a week.
One of those things that makes me go "Hmmmmm...?" How does that work mathematically? Sixty-five divided by nine is fractional. Some things I read make me scratch my head...
Ahem.
Nice videos!
Thanks for sharing! 8^)
Will the bad guys turn to stone? Any chance Jeff Lynne will be a remote pilot?
Seriously, though.....sounds great. I get a kick out of the names they give these projects. Have Blue is one of my all time favorites.
I guess we won't be using THAT on the Mexican border.
The eyes in the sky never blink.
Scary technology, necessary in war, but unfortunately coming soon to the skies over your town. Guess we'll all have to start wearing stealth burkas.
Not sure why the WaPo reported this now. This is only the beginning. From a Sep 2009 article:
...The $15 million Gorgon Stare, as the Air Force has labeled the new sensor, initially will be slung underneath the MQ-9 Reaper UAV. Other vehicles such as the RQ-4 Global Hawk UAV and even manned aircraft could be fitted later.
It's intended to supplement the multispectral targeting sensor that the Reaper carries now to transmit full-motion video of target areas. Gorgon Stare, which operates in the day and at night, has a slower refresh rate, but it will allow users to pick targets of interest that the full-motion video sensor can then focus on to get a more complete idea of their nature.
The future DARPA system, called the Autonomous Real-time Ground Ubiquitous Surveillance Imaging System (ARGUS-IS), takes the concept a step further. It uses a 1.8-gigapixel camera running at 15 frames per second to provide 27 gigapixels of video per second.
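The quoted throughput figure checks out arithmetically. A quick sketch (just multiplying the two numbers from the article):

```python
# Sanity-check the ARGUS-IS figures quoted in the article:
# a 1.8-gigapixel camera at 15 frames per second.
gigapixels_per_frame = 1.8
frames_per_second = 15

gigapixels_per_second = gigapixels_per_frame * frames_per_second
print(gigapixels_per_second)  # 27.0 -- matches the "27 gigapixels per second" claim
```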
That's one of three ARGUS-IS subsystems. The other two are an airborne processor that can handle more than 10 teraops (1 teraops = 10^12 operations per second) and a ground processing subsystem that records and displays the information sent to it by the airborne processor.
The 65 independent video feeds are the first application that will be embedded into the airborne processor. DARPA has already planned for a second application that will provide a real-time moving target indicator to enable users to track vehicles throughout the sensor's entire field of view.
The ARGUS-IS will first be integrated into the A160 Hummingbird unmanned helicopter for flight testing and demonstrations, DARPA officials said...
http://defensesystems.com/articles/2009/09/02/c4isr-3-gorgon-stare.aspx?sc_lang=en
Autonomous Real-time Ground Ubiquitous Surveillance - Imaging System (ARGUS-IS)
The technical emphasis of the program is on the development of three subsystems (a 1.8-gigapixel video sensor, an airborne processing subsystem, and a ground processing subsystem) that will be integrated to form ARGUS-IS. The 1.8-gigapixel video sensor produces more than 27 gigapixels per second running at a frame rate of 15 Hz. The airborne processing subsystem is modular and scalable, providing more than 10 teraops of processing. The gigapixel sensor subsystem and airborne processing subsystem will be integrated into the A-160 Hummingbird, an unmanned air vehicle, for flight testing and demonstrations. The ground processing subsystem will record and display information downlinked from the airborne subsystem.

The first application that will be embedded into the airborne processing subsystem is a video window capability. In this application, users on the ground will be able to select a minimum of 65 independent video windows throughout the field of view. The video windows, running at the sensor frame rate, will be downlinked to the ground in real time. Video windows can be used to automatically track multiple targets as well as to provide improved situational awareness. A second application is to provide a real-time moving target indicator for vehicles throughout the entire field of view in real time.
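The windowed-downlink design makes sense once you estimate the raw data rate of the full sensor. A back-of-envelope sketch (the 1 byte per pixel figure is my assumption for illustration; the actual bit depth isn't given in the source):

```python
# Why ARGUS-IS downlinks 65 small windows instead of the whole frame:
# estimate the raw data rate of the full 1.8-gigapixel, 15 Hz sensor.
pixels_per_second = 1.8e9 * 15        # 2.7e10 pixels/s, per the quoted specs
bytes_per_pixel = 1                   # ASSUMPTION: 8-bit monochrome, for illustration

raw_bytes_per_second = pixels_per_second * bytes_per_pixel
print(raw_bytes_per_second / 1e9)     # 27.0 -- i.e. ~27 GB/s of raw imagery
```

At tens of gigabytes per second, the full stream far exceeds any practical air-to-ground datalink, hence the onboard teraops-class processor that crops and tracks targets, sending only the selected windows down.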
http://www.darpa.mil/i2o/programs/argus/argus_approach.asp
For those interested, the Insight Program is also very cool...
(Click the Insight PDFs on the below link for more info)
http://www.darpa.mil/i2o/solicit/solicit_open.asp