The "Real Aerospace" Thread

Discussion regarding the Outsider webcomic, science, technology and science fiction.

Moderator: Outsider Moderators

gaerzi
Posts: 246
Joined: Thu Jan 30, 2020 5:14 pm

Re: The "Real Aerospace" Thread

Post by gaerzi »

Demarquis wrote:
Thu Jan 13, 2022 6:17 pm
That Doom playing bot appears to be an example of what I am describing: the bots play the game using the visual information from the screen (which is displaying the game Doom). They would pretty much have to do this, because, while I am not a programmer and do not have access to the developer notes for the game, there isn't any reason for the Doom software to be creating any data that isn't translated into the screen display. Why would it?
Of course there's a ton of data corresponding to things that aren't shown on the screen display! A monster that's behind a closed door still exists. It isn't generated at the moment you open the door.

Typically, game bots work by simply knowing stuff. They don't usually waste processing power on rendering a scene for the bot and then interpreting it; they just let the bots directly access the coordinates of everything. Basically, all bots use wallhacks. This particular example I showed doesn't do that, because there the aim of the exercise is precisely to work on the visual-interpretation side of the AI.
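
To illustrate the difference, here's a minimal sketch (all names hypothetical): a conventional bot reads the world state directly, while a vision-based bot only ever sees pixels.

Code: Select all

from dataclasses import dataclass

@dataclass
class Monster:
    x: float
    y: float
    behind_door: bool  # the entity exists in memory either way

# Hypothetical world state: every entity lives here, rendered or not.
world = [Monster(100.0, 50.0, False), Monster(300.0, 75.0, True)]

def wallhack_bot_targets(world):
    """A conventional game bot: reads coordinates straight out of the
    simulation, closed doors notwithstanding."""
    return [(m.x, m.y) for m in world]

def vision_bot_targets(framebuffer):
    """A vision-based bot like the Doom experiment: all it gets is
    pixels, so the monster behind the door simply isn't in its input."""
    raise NotImplementedError("needs a trained object detector")

print(wallhack_bot_targets(world))  # sees both monsters, door or no door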


What I'm talking about, then, is a question of perception. Even with machine vision and sensor data fusion over raw data from radar, IRST, optronics and whatever else, in the real world you don't get immediate access to perfect knowledge the way a bot does when it just reads values out of a simulation. Instead, you have to work through the filter of perception. You get input, whether from a retina or from a T/R module, and you have to interpret it, and that input can be misleading in all kinds of ways. For example, one of the problems they had to solve with sensor data fusion on the F-35 was duplication: one track gives you an aircraft at position X, another track gives you an aircraft at position Y, the two positions overlap, and the unwanted outcome is the system reporting two separate aircraft where there is only one. You've got to deal with electronic warfare (jamming, spoofing, etc.) and stealth. You've got to deal with blind spots in your sensors. You've got to deal with a lot of problems like this, because you can't read God's RAM.
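
To make the dedup problem concrete, here's a toy sketch (the gating threshold and the plain averaging are invented for illustration; real fusion also weighs covariances, kinematics and sensor trust):

Code: Select all

import math

def fuse_tracks(track_a, track_b, gate_km=2.0):
    """Toy track-to-track association: if two position estimates fall
    within a gating distance, call them one aircraft and average them;
    otherwise report two contacts."""
    if math.dist(track_a, track_b) <= gate_km:
        ax, ay = track_a
        bx, by = track_b
        return [((ax + bx) / 2, (ay + by) / 2)]  # one fused contact
    return [track_a, track_b]                    # two distinct contacts

# Radar and IRST estimates of the same aircraft, ~0.9 km apart:
print(fuse_tracks((10.0, 5.0), (10.8, 5.4)))  # -> one fused contact
# Two genuinely separate aircraft:
print(fuse_tracks((10.0, 5.0), (40.0, 5.0)))  # -> two contacts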

QuakeIV
Posts: 210
Joined: Fri Jul 24, 2020 6:49 pm

Re: The "Real Aerospace" Thread

Post by QuakeIV »

Right, we are talking about more complicated stuff that has already more or less solved that problem.

e: To be clear, I mean translating sensor measurements into coordinates, and also, to a degree, controlling the aircraft based on limited sensor measurements that are not omniscient. You can imagine that this has plausibly gotten further along than some guy making a tech demo that plays Doom.
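
For illustration, a minimal 1-D Kalman-filter sketch of what "turning limited, noisy measurements into a coordinate" means in practice (all numbers invented; a real tracker runs a full motion model):

Code: Select all

def kalman_1d(measurements, meas_var=4.0, process_var=1.0):
    """Minimal 1-D Kalman filter (static-target toy): blend a prediction
    with each noisy measurement, weighted by their uncertainties. This is
    the textbook core of turning imperfect sensor readings into a usable
    coordinate estimate."""
    x, p = measurements[0], meas_var   # initial state and uncertainty
    estimates = [x]
    for z in measurements[1:]:
        p += process_var               # predict: uncertainty grows
        k = p / (p + meas_var)         # gain: how much to trust the sensor
        x += k * (z - x)               # update toward the measurement
        p *= 1 - k                     # uncertainty shrinks after update
        estimates.append(x)
    return estimates

noisy = [10.2, 9.7, 10.5, 10.1, 9.9]   # jittery range readings
print(kalman_1d(noisy))                # smoothed position estimates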

Demarquis
Posts: 437
Joined: Mon Aug 16, 2021 9:03 pm

Re: The "Real Aerospace" Thread

Post by Demarquis »

"Of course there's a ton of data corresponding to things that aren't shown on the screen display! A monster that's behind a closed door still exists. It isn't generated at the moment you open the door."

So you're saying that certain data was transmitted to the AI that was not made available to the human? How do you know that? I couldn't find a mention in the article I cited.

"They don't usually waste processing power on rendering a scene for the bot and then interpreting it, they just let the bots access directly the coordinates of everything. Basically, all bots use wallhacks."

Can you provide a link for this? What is a "bot" in this context? Who uses them?

Even if true, I would think that an AI would have an even greater advantage in real life than in a sim because of this, being able to process more data inputs than a human Mk.I brain. If the fighter's sensor data is taken as equivalent to the data contained in a game, then the ability to dispense with the HUD display, if that's a thing, would, I think, be helpful.

Mk_C
Posts: 198
Joined: Sun Jul 26, 2020 11:35 am

Re: The "Real Aerospace" Thread

Post by Mk_C »

Demarquis wrote:
Thu Jan 20, 2022 5:19 pm
Even if true, I would think that an AI would have an even greater advantage in real life than in a sim because of this, being able to process more data inputs than a human Mk.I brain. If the fighter's sensor data is taken as equivalent to the data contained in a game, then the ability to dispense with the HUD display, if that's a thing, would, I think, be helpful.
Data processing capacity was never the bottleneck in this case.

Demarquis
Posts: 437
Joined: Mon Aug 16, 2021 9:03 pm

Re: The "Real Aerospace" Thread

Post by Demarquis »

Data processing capacity is always a factor when it comes to human brains.

Cthulhu
Posts: 910
Joined: Sat Dec 01, 2012 6:15 pm

Re: The "Real Aerospace" Thread

Post by Cthulhu »

Demarquis wrote:
Tue Jan 25, 2022 4:06 pm
Data processing capacity is always a factor when it comes to human brains.
The problem is not the capacity itself, but reliably assigning it to a task for its entire duration. Oh, look, shiny... :o

QuakeIV
Posts: 210
Joined: Fri Jul 24, 2020 6:49 pm

Re: The "Real Aerospace" Thread

Post by QuakeIV »

Cthulhu wrote:
Tue Jan 25, 2022 6:20 pm
Demarquis wrote:
Tue Jan 25, 2022 4:06 pm
Data processing capacity is always a factor when it comes to human brains.
The problem is not the capacity itself, but reliably assigning it to a task for its entire duration. Oh, look, shiny... :o
also, muh target fixation

Krulle
Posts: 1414
Joined: Wed May 20, 2015 9:14 am

Re: The "Real Aerospace" Thread

Post by Krulle »

Did I miss it?
Has nobody mentioned that the James Webb Space Telescope has reached L2?

https://blogs.nasa.gov/webb/2022/01/24/ ... ves-at-l2/

Cooling down as planned.
https://jwst.nasa.gov/content/webbLaunc ... sWebb.html has the "live" telemetry and temperatures.

Demarquis
Posts: 437
Joined: Mon Aug 16, 2021 9:03 pm

Re: The "Real Aerospace" Thread

Post by Demarquis »

How long until we start getting pictures?

peragrin
Posts: 24
Joined: Sat Jun 07, 2014 1:51 pm

Re: The "Real Aerospace" Thread

Post by peragrin »

Demarquis wrote:
Thu Jan 20, 2022 5:19 pm
"Of course there's a ton of data corresponding to things that aren't shown on the screen display! A monster that's behind a closed door still exists. It isn't generated at the moment you open the door."

So you're saying that certain data was transmitted to the AI that was not made available to the human? How do you know that? I couldn't find a mention in the article I cited.

"They don't usually waste processing power on rendering a scene for the bot and then interpreting it, they just let the bots access directly the coordinates of everything. Basically, all bots use wallhacks."

Can you provide a link for this? What is a "bot" in this context? Who uses them?

Even if true, I would think that an AI would have an even greater advantage in real life than in a sim because of this, being able to process more data inputs than a human Mk.I brain. If the fighter's sensor data is taken as equivalent to the data contained in a game, then the ability to dispense with the HUD display, if that's a thing, would, I think, be helpful.
Almost all game AIs today use this method. StarCraft is most famous for its hard AIs, but when those AIs are limited to human reaction times and to identifying only the objects actually visible on screen, the same killer AI suddenly shows just how limited it is.

https://arstechnica.com/gaming/2019/01/ ... air-fight/
Most AIs need help, or some kind of advantage humans aren't allowed, like direct access to the game feed so they can see all stats at the same time.
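
As a sketch of what such a handicap layer might look like (the numbers are illustrative, not what any real evaluation actually used):

Code: Select all

import collections

class HumanlikeThrottle:
    """Toy handicap layer: cap the bot's actions-per-minute and impose
    a fixed reaction delay before it may act on new information."""
    def __init__(self, max_apm=180, reaction_delay=0.25):
        self.min_gap = 60.0 / max_apm        # seconds between actions
        self.reaction_delay = reaction_delay # seconds before info is 'seen'
        self.last_action = float("-inf")
        self.percepts = collections.deque()  # (timestamp, observation)

    def observe(self, now, observation):
        self.percepts.append((now, observation))

    def act(self, now):
        """Return the freshest observation the bot may react to,
        or None if it is throttled or hasn't 'seen' anything yet."""
        if now - self.last_action < self.min_gap:
            return None   # over the APM budget
        seen = [o for t, o in self.percepts
                if now - t >= self.reaction_delay]
        if not seen:
            return None   # nothing old enough to have registered
        self.last_action = now
        return seen[-1]

bot = HumanlikeThrottle(max_apm=120)
bot.observe(0.0, "enemy spotted")
print(bot.act(0.1))   # None: still inside the reaction delay
print(bot.act(0.3))   # 'enemy spotted': old enough to react to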

Arioch
Site Admin
Posts: 4497
Joined: Sat Mar 05, 2011 4:19 am
Location: San Jose, CA

Re: The "Real Aerospace" Thread

Post by Arioch »

Demarquis wrote:
Wed Jan 26, 2022 10:38 pm
How long until we start getting pictures?
I think it's going to be going through calibration procedures for something like 5 months.

Krulle
Posts: 1414
Joined: Wed May 20, 2015 9:14 am

Re: The "Real Aerospace" Thread

Post by Krulle »

Demarquis wrote:
Wed Jan 26, 2022 10:38 pm
How long until we start getting pictures?
"Webb" is going through calibration (app. 5 months, 3 months of which are minimum for the optics).
Part of the calibration is making images of sky segments which are well documented, e.g. through previous pictures by Hubble or similar.
By the end of February the first pictures will have been analysed by NASA, and by the end of March the first ones should be ready for release, with comparisons of what previous telescopes showed against the new JWST.

But the next three months are mainly about adjusting the optics by comparing images against well-documented references, again and again.
We might get some releases, mainly to highlight how much more Webb can see than Hubble or ground-based telescopes,
but don't expect much: the calibration process is intricate, and because of the stiffness of the system, changing one variable changes a lot of others as well.
https://blogs.nasa.gov/webb/2022/01/19/webb-mirror-segment-deployments-complete/ wrote: “Next up in the wavefront process, we will be moving mirrors in the micron and nanometer ranges to reach the final optical positions for an aligned telescope. The process of telescope alignment will take approximately three months.”

—Erin Wolf, James Webb Space Telescope Program Manager, Ball Aerospace
First "new" segments of the space will not be captured before summer.
Until then, only higher definition of known segments of space.
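
As a toy illustration of that "again and again" loop (the gain and coordinates are invented; real JWST wavefront sensing is far more involved):

Code: Select all

# Toy segment-alignment loop: 18 mirror segments each produce a spot
# image; nudge every spot a fraction of the way toward a common target,
# pass after pass.
TARGET = (0.0, 0.0)   # desired common image position (arbitrary units)
GAIN = 0.5            # fraction of the measured error corrected per pass

def align_step(spots):
    """One correction pass: move each spot partway toward the target."""
    return [(x + GAIN * (TARGET[0] - x), y + GAIN * (TARGET[1] - y))
            for x, y in spots]

# 18 initially scattered spots, one per mirror segment:
spots = [(i * 0.3 - 2.5, (i % 5) * 0.2 - 0.4) for i in range(18)]
for _ in range(10):   # "again and again"
    spots = align_step(spots)
print(max(abs(x) + abs(y) for x, y in spots))  # residual ~1/1000th of start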

Demarquis
Posts: 437
Joined: Mon Aug 16, 2021 9:03 pm

Re: The "Real Aerospace" Thread

Post by Demarquis »

I note with interest that a SpaceX Falcon 9 rocket stage will soon impact the Moon: https://www.space.com/spacex-falcon-9-r ... march-2022

"The rocket in question launched the Deep Space Climate Observatory (DSCOVR), a joint effort of the U.S. National Oceanic and Atmospheric Administration and NASA. DSCOVR studies our planet and the space weather environment from the Earth-sun Lagrange Point 1 (L1), a gravitationally stable spot about 930,000 miles (1.5 million kilometers) from Earth in the sunward direction. (NASA's $10 billion James Webb Space Telescope just arrived at L2, which is 930,000 miles from Earth in the other direction, toward the orbit of Mars.)"

Krulle
Posts: 1414
Joined: Wed May 20, 2015 9:14 am

Re: The "Real Aerospace" Thread

Post by Krulle »

First image!
A kind of selfie: the image shows the target star and its reflection in each of the 18 mirror segments of the Webb telescope.
https://twitter.com/NASA/status/1492166471769808916 wrote:@NASAWebb’s Near Infrared Camera, or NIRCam, recently became operational and captured this “selfie” of the telescope’s primary mirror. Webb will continue aligning its 18 mirrors over the next three months before it can #UnfoldtheUniverse: https://blogs.nasa.gov/webb/2022/02/11/ ... -18-times/
The image of star HD 84406 is fuzzy, but hey, the main intention is the alignment of the mirrors!
