

Posted (edited)
On 3/4/2019 at 4:34 AM, sketchley said:

I'm still in the process of finishing up the PHP and adding some additional translations for its release on my site: http://sdfyodogawa.mywebcommunity.org/OTvfmf/OTvfmf.php

In VF-1 Space Wings there is an unofficial account of an unmanned VF-1 using low-power acceleration for a long time to maximize endurance, going to Saturn and back to the Moon. That makes me wonder if QFs are really disposable, or if they shut down almost completely when hitting Bingo Fuel and then coast to a predefined recovery orbit, like an interplanetary ion-engine probe of sorts.

Edited by Aries Turner
Grammar.
Posted
22 minutes ago, Aries Turner said:

That makes me wonder if QFs are really disposable or if they shut almost completely down and coast to a predefined recovery orbit, like an interplanetary ion-engine probe of sorts.

Given that most of the unmanned fighters we've seen in Macross are equipped with sophisticated AIs capable of semi-autonomous combat and thermonuclear reaction turbine engines, I doubt any of them are cheap enough to be considered disposable.  Less expensive to lose than a manned VF, sure, but not out-and-out disposable.

Though at least some of them are apparently capable of fully autonomous operation, they're generally operated semi-autonomously from an aircraft or ship outfitted for controlling a group of drones.  With operating time and range in space limited by available reactor fuel, it's unlikely they were ever operating at extreme long ranges away from a mothership which could either order them home or recover them itself.

Posted (edited)

Lilldraken are described as such, but I agree about the reaction engine and the sophisticated electronics. Maybe what was meant was not disposable, but attritable.

And such AIs should be quite capable of working on emergency low power, shutting down unnecessary systems and even recalibrating for damaged thrusters to achieve limp-home capability, even with major damage.

[Edit] Wait! My mistake: I agree about the reaction engine and the sophisticated AI ALGORITHMS, but not about the electronics, which should be a pretty boring SoC array.

Edited by Aries Turner
Grammar
Posted
14 minutes ago, Aries Turner said:

Lilldraken are described as such, but I agree about the reaction engine and the sophisticated electronics. Maybe what was meant was not disposable, but attritable.

Due to the general lack of detailed information from Macross Delta, it's hard to say how the Lilldrakens shake out cost- and hardware-wise.

What little we do know about the Lilldrakens does suggest there's been at least some economizing going on, though.  Like the fact that they have energy converting armor but don't have enough generator output to actually power it, so it only runs when there is excess power being supplied from the mothership.  It's not clear if they're capable of continuing the fight after their mothership is lost the way an AIF-7S/QF-4000 Ghost can.  If not, that may be another area where a cost-saving cut was made.

 

14 minutes ago, Aries Turner said:

And such AIs should be quite capable of working on emergency low power, shutting down unnecessary systems and even recalibrating for damaged thrusters to achieve limp-home capability, even with major damage.

One of the points of unmanned fighters is that they have very little that could be called "unnecessary" in their systems... power to weapons would be about the only thing they could conceivably cut to save energy.  

Posted (edited)
57 minutes ago, Seto Kaiba said:

One of the points of unmanned fighters is that they have very little that could be called "unnecessary" in their systems... power to weapons would be about the only thing they could conceivably cut to save energy.  

- Most of the CPUs, particularly the more powerful ones in a Hi-Lo configuration, and the corresponding memory banks for ECCM and target recognition.

- Radar, if electro-optics (El-Op) are enough, as radar is a really power-hungry component.

- Power-hungry, high-bandwidth data-link radios, switching to low-power, low-bandwidth radio beacons.

- Maybe even the reaction engine if thrusters are enough to limp back home.
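In rough pseudo-code, the kind of load-shedding I mean would look something like this sketch (the subsystem names and power figures below are pure invention, just to illustrate the priority order):

```python
# Hypothetical emergency load-shedding for a crippled QF. Names and
# wattages are invented for illustration; nothing here is from canon.
SHED_ORDER = [
    ("hi_side_cpus",       40_000),   # Hi-Lo config: drop the "Hi" cores first
    ("eccm",              250_000),
    ("target_recognition", 60_000),   # keep short-range object recognition
    ("radar",             500_000),   # keep passive El-Op running instead
    ("hi_bw_datalink",     30_000),   # fall back to a low-power beacon
]

def shed_until(budget_w, load_w):
    """Shut subsystems down in priority order until load fits the budget."""
    for name, draw_w in SHED_ORDER:
        if load_w <= budget_w:
            break
        load_w -= draw_w
        print(f"shut down {name}: load now {load_w / 1000:.0f} kW")
    return load_w
```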

Edited by Aries Turner
Posted
25 minutes ago, Aries Turner said:

- Most of the CPUs, particularly the more powerful ones in a Hi-Lo configuration, and the corresponding memory banks for ECCM and target recognition.

ECCM and active stealth could be done without, but turning off object recognition seems like a bad idea since that'd prevent the drone from assessing what objects in its path are navigational hazards.

 

25 minutes ago, Aries Turner said:

- Radar, if electro-optics (El-Op) are enough, as radar is a really power-hungry component.

25 minutes ago, Aries Turner said:

- Power-hungry, high-bandwidth data-link radios, switching to low-power, low-bandwidth radio beacons.

"Power hungry" is relative.  On a modern aircraft these things are incredibly energy-intensive, but we're talking about a drone that has at least a couple hundred megawatts to throw around.

 

25 minutes ago, Aries Turner said:

- Maybe even the reaction engine if thrusters are enough to limp back home.

Now that's almost certainly a bad idea, given that that's the power source.

Posted
4 hours ago, Aries Turner said:

In VF-1 Space Wings there is an unofficial account of an unmanned VF-1 using low-power acceleration for a long time to maximize endurance, going to Saturn and back to the Moon. That makes me wonder if QFs are really disposable, or if they shut down almost completely when hitting Bingo Fuel and then coast to a predefined recovery orbit, like an interplanetary ion-engine probe of sorts.

It's a little different than that (it's not a QF per se).  Nevertheless, enjoy the full story: http://sdfyodogawa.mywebcommunity.org/OTvfmf/VFMFvf1valkyrieWingsOfSpace.php#082

Posted (edited)
6 hours ago, Seto Kaiba said:

turning off object recognition seems like a bad idea

Turning off TARGET recognition. Object recognition can be done at much shorter ranges, with lower resolution, processing power, and memory requirements.

6 hours ago, Seto Kaiba said:

"Power hungry" is relative.  On a modern aircraft these things are incredibly energy-intensive, but we're talking about a drone that has at least a couple hundred megawatts to throw around.

 

6 hours ago, Seto Kaiba said:

Now that's almost certainly a bad idea, given that that's the power source.

(about turning off the reaction engine).

Notice I said "Maybe" and "if thrusters are enough to limp back home". I should have specified "if thrusters AND battery/condenser power", because of course electronics require power. There are certain situations where it may be unwise to turn on a reaction engine, battle damage being one of those, and here I am implying it may already be operating on backup measures without engine power. You can't count on a major system for a backup fail-safe capability.

So yeah, in that situation, even turning off high-bandwidth radios counts, and it may be unwise to do otherwise, as a drone on the brink of being totally inoperative and unable to follow even basic commands for lack of fuel, power, or structural integrity can't be expected to provide remote surveillance capability.


Edited by Aries Turner
Posted
10 hours ago, Aries Turner said:

Turning off TARGET recognition. Object recognition can be done at much shorter ranges, with lower resolution, processing power, and memory requirements.

... you do realize we're talking about a drone in space, right?  Long-range detection and ranging of navigational hazards is essential if the drone is to survive at all, since the speed that navigational hazards in space are moving is going to be considerable and if the drone is budgeting propellant it will need to make its course corrections well in advance to avoid any collisions.  Otherwise it'll end up like the Project Trapeze VF-1 that got thoroughly trashed by micrometeor impacts.

 

10 hours ago, Aries Turner said:

(about turning off the reaction engine).

Notice I said "Maybe" and "if thrusters are enough to limp back home". I should have specified "if thrusters AND battery/condenser power", because of course electronics require power. There are certain situations where it may be unwise to turn on a reaction engine, battle damage being one of those, and here I am implying it may already be operating on backup measures without engine power. You can't count on a major system for a backup fail-safe capability.

That's assuming there's even a high-energy capacitor bank or battery system capable of running the drone's systems long-term.

Even VFs aren't known to have that level of power system redundancy.

Posted (edited)
1 hour ago, Seto Kaiba said:

you do realize we're talking about a drone in space, right?  Long-range detection and ranging of navigational hazards is essential if the drone is to survive at all, since the speed that navigational hazards in space are moving is going to be considerable and if the drone is budgeting propellant it will need to make its course corrections well in advance to avoid any collisions.  Otherwise it'll end up like the Project Trapeze VF-1 that got thoroughly trashed by micrometeor impacts.

I do realize we're talking about Macross drones in space, where Pioneers and Voyagers sailed with far less technology. Where you can pinpoint a VF with utmost accuracy at extremely long range, but can settle for an Island-class ship or even the nearest planet if forced to use far less angular resolution. And as you said, QFs aren't meant to operate that far from fleets.

Micrometeor impacts, I concede, are a bigger issue in the Macross universe than in the real one. Still, a battered QF may or may not endure yet another impact, but while limping slowly home, enemy mecha rather than micrometeors are the biggest concern. If the QF can't avoid those, the QF can't avoid those: no sense worrying about even more improbable 'ifs'.

 

1 hour ago, Seto Kaiba said:

That's assuming there's even a high-energy capacitor bank or battery system capable of running the drone's systems long-term.

High-energy capacitor? No, a low-energy one should suffice for mostly inertial navigation at low thruster acceleration, and if those don't, I'll hang the engineer, because it is doable, has little cost, adds little weight since it uses systems already present in a not-that-unconventional way, and could have saved billions of (yen-like) credits.

Edited by Aries Turner
Posted
1 hour ago, Aries Turner said:

I do realize we're talking about Macross drones in space, where Pioneers and Voyagers sailed with way less technology.

Probes like the Pioneers and Voyagers weren't designed to operate in close proximity to planets, where high-velocity debris is most likely to gather thanks to gravity.  The drones don't have that advantage, since most of them are employed in planetary defense and as advance patrols and first responders by emigrant fleets that, by their very nature, tend to loiter in close proximity to planets and other large celestial bodies.

 

1 hour ago, Aries Turner said:

Where you can pinpoint a VF with utmost accuracy at extremely long range, but can settle for an Island-class ship or even the nearest planet if forced to use far less angular resolution. And as you said, QFs aren't meant to operate that far from fleets.

That would tend to make the entire question academic.  If they're close enough to spot a mothership with a RADAR, LIDAR, or optical array then they're likely well within remote control range.

 

1 hour ago, Aries Turner said:

Micrometeor impacts, I concede, are a bigger issue in the Macross universe than in the real one. Still, a battered QF may or may not endure yet another impact, but while limping slowly home, enemy mecha rather than micrometeors are the biggest concern. If the QF can't avoid those, the QF can't avoid those: no sense worrying about even more improbable 'ifs'.

All the more reason that maintaining object detection and ranging would be important.  If the drone is attempting to stay in one piece, flying directly past the enemy is not the way to do it.  (Nor, for that matter, would disabling ECCM and active stealth.)

Posted (edited)
54 minutes ago, Seto Kaiba said:

Probes like the Pioneers and Voyagers weren't designed to operate in close proximity to planets, where high velocity debris is most likely to gather thanks to gravity.

But Cassini did:

https://en.wikipedia.org/wiki/Cassini–Huygens

54 minutes ago, Seto Kaiba said:

If they're close enough to spot a mothership with a RADAR, LIDAR, or optical array then they're likely well within remote control range.

Controlling a still-ongoing battle with still-operational assets takes priority over recovery operations. Also, if the drone can do it on its own, a human observer/controller would be better used anywhere else.

 

54 minutes ago, Seto Kaiba said:

flying directly past the enemy is not the way to do it.

But flying directly towards a big vessel the enemy is already aware of, at low, almost dead speed, in the middle of ongoing chaos, is still stealthy enough. Debris is debris, and enemy mecha will certainly be more occupied with your still-operational assets. If not, they can pursue your ailing drones to dispatch them, but at that point, without operational assets, you may already be screwed.

54 minutes ago, Seto Kaiba said:

That would tend to make the entire question academic.

Without detailed QF subsystem information like we have for almost any VF, that would be true. But if your octa-core mobile has that little extra low-power NINTH core, officially managing the 4/5G radio and unofficially peeking here and there, a QF could do with a 'backseat' core or cores that determine whether the drone is fully operational or mostly inoperative, and then take charge: check the remaining sensor systems; check the still-operational thrusters, measuring deviations from expected acceleration with internal MEMS accelerometers and quantum gyroscopes; determine whether each thruster is fully or partly operational; adjust for partly operational thrusters giving acceleration along the wrong vectors; and create a new navigation model accounting for what is available, to make the best use of what remains. All of that a Raspberry Pi could do with meager power.
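As a sketch of what that backseat core would actually compute (every interface here is hypothetical; the point is only that the math is trivial):

```python
# Hypothetical "backseat core" check: fire a short test pulse on each
# thruster and compare measured acceleration against the nominal vector.
import numpy as np

def assess_thruster(nominal_vec, imu_read, fire_test_pulse):
    """Estimate a thruster's actual thrust vector from IMU deltas."""
    baseline = imu_read()            # MEMS accelerometer reading, m/s^2
    fire_test_pulse()                # brief, low-power test burn
    actual = imu_read() - baseline   # measured acceleration delta
    nominal = np.asarray(nominal_vec, dtype=float)
    # Fraction of nominal thrust actually delivered along the right axis:
    efficiency = actual @ nominal / max(nominal @ nominal, 1e-9)
    return actual, efficiency

# The navigation model then plans burns using each thruster's *measured*
# vector, so a damaged, off-axis thruster still contributes usefully.
```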

For all we know, even discarded super parts could have been doing that for the entire saga, because even without cameras, strategically placed passive antennas could triangulate the vector towards a still-transmitting friendly. Maybe even towards the vessel you were reporting to, which is still trying to regain control, not yet knowing you may have been utterly disabled.
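And the passive triangulation really is cheap: two antennas a known distance apart already give a bearing from the phase difference alone. A sketch, with arbitrary numbers:

```python
# Angle of arrival from the phase difference between two passive
# antennas (basic interferometry). All numbers are arbitrary examples.
import math

def bearing_deg(delta_phi_rad, baseline_m, freq_hz, c=299_792_458.0):
    wavelength = c / freq_hz
    s = delta_phi_rad * wavelength / (2 * math.pi * baseline_m)
    return math.degrees(math.asin(max(-1.0, min(1.0, s))))

# 0.5 m antenna spacing, 90 degrees of phase lag at 1 GHz:
print(f"{bearing_deg(math.pi / 2, 0.5, 1e9):.1f} deg off boresight")  # ~8.6
```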

Edited by Aries Turner
Posted
 

But Cassini did:

Excluding the Cassini probe's trip through the gap between Saturn's F and G rings during orbital insertion - which required special measures to be taken to avoid damage even in the relatively debris-free space between the rings - it generally stayed at least a light-second from any planet or moon.  

That's quite a bit different from, say, flying through the Cassini Division laterally, in orbit of a planet with significant space traffic, or in an asteroid field being mined for resources by an emigrant fleet.

 

 

Controlling a still-ongoing battle with still-operational assets takes priority over recovery operations. Also, if the drone can do it on its own, a human observer/controller would be better used anywhere else.

That kind of assumes the human operator is directly controlling the unit... the way it's described is more like an RTS, where they're selecting a bunch of semi-autonomous units and pointing them in the general direction of something that needs its sh*t wrecked, assigning target priorities, etc.  One of the benefits of using 'em is that you can manage a large drone air force with relatively few meatbags.

 

 

But flying directly towards a big vessel the enemy is already aware of, at low, almost dead speed, in the middle of ongoing chaos, is still stealthy enough.

As a rule, if an enemy force (e.g. a Zentradi Army branch fleet) has already spotted the emigrant ship... the entire situation has gone sufficiently pear-shaped that recovery is not likely to occur at all.  (Legging it is still the preferred solution in situations like that, and in some Master File accounts that includes self-destructing any ships that can't make their fold jump away, to prevent capture.)

 

 

Debris is debris, and enemy mecha will certainly be more occupied with your still-operational assets. If not, they can pursue your ailing drones to dispatch them, but at that point, without operational assets, you may already be screwed.

Depends what kind of enemy.  Some, like the Vajra, are known to home in on things like active fold wave emissions from communications systems, fold reactors, and the like, so the drone may still be drawing fire even if it's operating in low power mode simply because it presents as a threat and is an easy target.

It'd be less an issue for, say, an enemy VF that's using a mixture of RADAR, LIDAR, infrared, and optical sensing rather than fold wave detection since the only thing likely to draw attention is the drone's engine heat.

 

 

Without detailed QF subsystem information like we have for almost any VF, that would be true. But if your octa-core mobile [...]

While we haven't had much information on how overtechnology-based artificial intelligence systems work in Macross on a hardware level, some of the terminology choices suggest they're using an entirely different architecture from anything we would recognize as a conventional computer.  This reasoning may or may not apply... doubly so given that some of the systems are apparently based on organic technology.  (Save for Macross II where the word "apparently" gets jettisoned from that sentence, because it's outright confirmed.)

 

 

For all we know, even discarded super parts could have been doing that for the entire saga, because even without cameras, strategically placed pasive antennas could triangulate the vector towards a still transmitting friendly. Maybe even the vessel you were reporting to that is still trying to regain control, not knowing yet you may have been utterly disabled.

Super Packs are something we've had a fair amount of insight into the inner workings of, and they're generally pretty bare-bones because they are intended to be disposable should the situation call for it.

The first several generations of Super Pack were mostly just high-capacity fuel tanks and hybrid rocket engines with some additional verniers.  The addition of weapons to the packs, in the form of some simple micromissile launchers, was almost an afterthought.  It wasn't until changes in engine technology and a larger average airframe greatly reduced the need for massive bolt-on fuel tanks in space that we started to see recoverable Super Packs.  That was when the focus of a Super Pack changed from extending operating time to offsetting the loss of performance caused by carrying absurd amounts of ordnance.  Those packs could even bunch up in orbit for later automatic reattachment if a fighter needed to make planetfall, so I wouldn't be surprised if they also had recovery beacons of their own... though normally if a Super Pack is being jettisoned during a fight, it's because the pack is damaged to the point that it's a liability or being used as an emergency anti-missile countermeasure.

Posted (edited)
 

Legging it is still the preferred solution in situations like that

You do realize we are talking about non-autonomous drones on bingo fuel, or crippled ones? Legging it was never an option for those: they are lost if that is the situation.

 

 

the drone may still be drawing fire even if it's operating in low power mode simply because it presents as a threat and is an easy target

If a crippled drone is drawing fire, it is working successfully as a decoy, drawing fire away from fully operational assets.

The same can be said if a crippled drone is drawing the least bit of focus from a human controller, whether micro- or macro-RTS management: a lil' victory for the enemy in this case.

 

 

Super Packs are something we've had a fair amount of insight into the inner workings of, and they're generally pretty bare-bones because they are intended to be disposable

...and then...

 

Those packs could even bunch up in orbit for later automatic reattachment if a fighter needed to make planetfall, so I wouldn't be surprised if they also had recovery beacons of their own

 

"Automatic" implies a certain amount of electronics. Even if just four Raspberry Zero W with included antenna in a quadruplex redundancy configuration. Want something less complex? You'll be hard pressed, but you can still go for four ESP8266. And yes, those can control a drone.

https://en.wikipedia.org/wiki/ESP8266
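A quadruplex setup like that only needs a majority vote on each command. Minimal sketch:

```python
# Majority voting across four redundant controllers. With four units,
# one faulty output can be outvoted; no quorum means fail safe.
from collections import Counter

def vote(commands):
    winner, count = Counter(commands).most_common(1)[0]
    return winner if count >= 3 else "FAILSAFE"

print(vote(["burn_2s", "burn_2s", "burn_2s", "burn_9s"]))  # burn_2s
print(vote(["burn_2s", "burn_2s", "hold", "burn_9s"]))     # FAILSAFE
```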

Edited by Aries Turner
Posted
 

You do realize we are talking about non-autonomous drones on bingo fuel, or crippled ones? Legging it was never an option for those: they are lost if that is the situation.

The fleet legging it... meaning recovering the drone wouldn't be on anyone's mind anyway.

Also, since the verniers and main engines are pulling from the same tanks most of the time, a drone on bingo fuel wouldn't be maneuvering, period.

 

 

The same can be said if a drone is drawing the least bit of focus from a controller, whether micro- or macro-RTS management: a lil' victory for the enemy in this case.

It may not even be something that needs to be done by a human, either.  When even the equivalent of a forklift has a support AI, the C4I center on the mothership probably has some fairly grunty AI assistance so the human operators don't need to take their attention off the battle to poke individual recoverable assets one at a time.

 

 

"Automatic" implies a certain amount of electronics.

For the record, it isn't clear if that reattachment function is on the VF, the Pack, or both.

Posted

One interesting thing I noticed while I was reading the Project Trapeze section over lunch... for the long flight, it sounds like they converted that VF-1A's engines from an augmented fusion rocket to a fusion-powered thermal monopropellant rocket.  Kind of a neat way to think about long-duration spaceflight for a VF and possibly a fun little nod to Gundam's UC timeline as well.

Posted
 

a drone on bingo fuel wouldn't be maneuvering, period

Now I see the misunderstanding: bingo fuel is when you barely have the fuel to reach a secondary landing zone, the indication that you need to turn tail and go home or risk swimming. A drone on bingo fuel can maneuver, just not for very long, and you have to be extremely fuel-conscious.

 

 

It may not even be something that needs to be done by a human either

...and then...

 

the C4I center on the mothership probably has some fairly grunty AI assistance

...and yet then...

 

For the record, it isn't clear if that reattachment function is on the VF, the Pack, or both.

In the end, any command needs some kind of logic on the receiving end. To take commands from a grunty AI you still need at least a stupid AI. And those are too dirt cheap not to include even in the most menial equipment, even lowly automatic dispenser machines with harassing functionality.

Posted
 

Now I see the misunderstanding: bingo fuel is when you barely have the fuel to reach a secondary landing zone, the indication that you need to turn tail and go home or risk swimming. A drone on bingo fuel can maneuver, just not for very long, and you have to be extremely fuel-conscious.

I'm well aware of what bingo fuel means.

Space is rather different from atmosphere, however.  You don't need to continuously run the engine to stay flying, but you do need to save enough fuel to decelerate when you get where you're going and want to stop.  Thermonuclear reaction engines are fabulously inefficient in space flight, so a drone that has just enough fuel slush left to limp home is not a likely candidate to do anything except set a straight-line course home, burn the engines long enough to get up to speed, and save the rest of its go juice to decelerate for recovery.  
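Back-of-envelope, the burn plan writes itself (all the numbers below are invented for illustration):

```python
# Hypothetical bingo-fuel burn plan: symmetric accelerate/decelerate
# burns with a small terminal-correction reserve. Numbers are invented.
dv_left_ms = 200.0               # delta-v remaining, m/s
reserve = 0.10                   # 10% held back for terminal corrections
distance_m = 500_000.0           # 500 km back to the mothership

dv_burn = dv_left_ms * (1 - reserve) / 2   # each of the two burns
coast_h = distance_m / dv_burn / 3600      # coast time, starting from rest
print(f"burn {dv_burn:.0f} m/s, coast {coast_h:.1f} h, burn {dv_burn:.0f} m/s")
```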

 

 

In the end, any command needs some kind of logic on the receiving end. To take commands from a grunty AI you still need at least a stupid AI. And those are too dirt cheap not to include even in the most menial equipment, even lowly automatic dispenser machines with harassing functionality.

Not necessarily, in the case of the Super Packs.  VFs with the linear actuator system are capable of some incredibly complex feats of electromagnetic field manipulation to keep parts in proper alignment during transformation and so on.  It's possible that same field manipulation is what is causing the "bunching" of the pack on ejection and the proper reconnection on the VF's return.  Putting an AI into every piece of the Super Pack would potentially take a trivial cost into nontrivial territory and incur additional costs from the support systems needed to keep those AIs running.

(There are a few isolated instances of FAST Packs having internal power sources, mainly capacitors but in one case a small reactor, but most packs operate solely on external power from the VF itself.)

Posted (edited)
 

a drone that has just enough fuel slush left to limp home is not a likely candidate to do anything except set a straight-line course home, burn the engines long enough to get up to speed, and save the rest of its go juice to decelerate for recovery. 

Then we agree the best course of action is just to locate your big mothership, a shiny dot moving against the background, from which control transmissions were sent just a moment ago; manage to vector an intercept course; not care in any way about enemy vessels or meteoroids, against which you can't even maneuver at this point; and hope for the best, as I said already.

 

Putting an AI into every piece of the Super Pack would potentially take a trivial cost into nontrivial territory and incur additional costs from the support systems needed to keep those AIs running. 

The key here is what we call an AI, given the above limitations, which force even simpler behavior than a trash-disposal robot's. I started with a reference which Mr. Sketchley generously provided, so I'll drop another:

http://macross2.net/m3/sdfmacross/robots-2012.htm

Putting an AI into every piece of trivial technology is arguably what Macross is all about. Dirt cheap is still trivial cost: you can't afford not to embed those. Not doing so could make you lose valuable equipment.

No, really, there is a lower limit on chip size where, if the logic you really require sits in a corner, leaving the rest unpopulated has the same cost as populating it with gadgetry you may or may not use for future requirements. That is why you have a phone: entire systems on a chip. OTEC makes for lower limits but also lower sizes, so you'll end up with the same issue, putting entire AI-level circuitry in a wheel microcontroller. Either you limit the software to do just that, or you upload upgraded firmware over time to do what we see on screen.

The chip inside this, without OTEC, costs about a cent. It could do better with liquid-hydrogen refrigeration inside a fuel tank than with aluminium heat sinks and rotating coolers.

https://coral.withgoogle.com/

Edited by Aries Turner
Posted

You know, that M3 link reminds me...

 

I've long been curious how that dang Petit Cola machine was POWERED. Refrigeration isn't exactly a low-energy process, and I'm FAIRLY sure they haven't installed microfusion generators in friggin' coke machines.

Are they using Tesla-style remotely-beamed power? Do they have a short time they can be away from their power dock before the drinks start warming up? Do they just carry a tank of liquid nitrogen for cooling? IMPORTANT QUESTIONS ARE BEING ASKED HERE!

Posted (edited)
 

You know, that M3 link reminds me...

 

I've long been curious how that dang Petit Cola machine was POWERED. Refrigeration isn't exactly a low-energy process, and I'm FAIRLY sure they haven't installed microfusion generators in friggin' coke machines.

Are they using Tesla-style remotely-beamed power? Do they have a short time they can be away from their power dock before the drinks start warming up? Do they just carry a tank of liquid nitrogen for cooling? IMPORTANT QUESTIONS ARE BEING ASKED HERE!

Maybe they get their cooling from sound? https://phys.org/news/2016-12-refrigerator-multistage.html

(the link isn't exactly what I had in mind... saw something on Discovery Channel that predates it by about 5 years)

 

Cool Macross sound connection aside, I don't think Kawamori-san et al. had that in mind.  It's probably something mundane like OTEC-enhanced batteries, and the vending machines have to return to a charging station (storage) every so often (like the tin-can-collecting robots in DYRL: waiting until needed, or until an unsuspecting passerby stumbles too close and they can start harassing them to make a purchase! :p )

I wonder how many harassment or stalking lawsuits Petite Cola has gotten over the years? (ANOTHER IMPORTANT QUESTION HERE!)

Edited by sketchley
Posted
 

Then we agree the best course of action is just to locate your big mothership, a shiny dot moving against the background, from which control transmissions were sent just a moment ago; manage to vector an intercept course; not care in any way about enemy vessels or meteoroids, against which you can't even maneuver at this point; and hope for the best, as I said already.

Within the bounds of the very, VERY specific corner case you've constructed here... yes.

 

 

Putting an AI into every piece of trivial technology is arguably what Macross is all about.

... which naturally explains why the technology is nowhere to be found in Macross outside of high-end military hardware[1], litter-picking robots, and vending machines.

 

 

 

No, really, there is a lower limit on chip size where, if the logic you really require sits in a corner, leaving the rest unpopulated has the same cost as populating it with gadgetry you may or may not use for future requirements. That is why you have a phone: entire systems on a chip.

Ah, I see the reason for the misconception... you're operating under the common assumption that the system-on-chip in something like a phone is doing the thinking locally.  The reason those applications of very, VERY rudimentary heuristics work at all is 1. because of how basic they are and 2. because much of the actual processing burden is non-local... it's being done external to the system-on-chip, on a remote server or in "the cloud"[2].  When you try to do the thinking locally, it requires a lot more system resources.

Even a seemingly simple task like driving a vehicle on well-mapped public roads (Level 3 or 4 vehicular autonomy) requires a surprising amount of power just to handle sensor input analysis.  So much so that those computers can end up weighing hundreds of pounds when you factor in their power and cooling requirements.  (This I know from firsthand professional experience.)

 

 

I've long been curious how that dang Petit Cola machine was POWERED. Refrigeration isn't exactly a low-energy process, and I'm FAIRLY sure they haven't installed microfusion generators in friggin' coke machines.

Are they using Tesla-style remotely-beamed power? Do they have a short time they can be away from their power dock before the drinks start warming up? Do they just carry a tank of liquid nitrogen for cooling? IMPORTANT QUESTIONS ARE BEING ASKED HERE!

Likely by a high-efficiency battery or fuel cell.  Given the many interesting things Overtechnology has achieved with carbon allotropes, I'd assume it's probably something analogous to a stacked-graphene supercapacitor.  Even with today's technology those achieve similar energy density and slightly better performance than modern lithium-ion batteries.  OTM would probably better that substantially and give those machines a range measured in weeks or hundreds of miles.
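For a sense of scale, some back-of-envelope arithmetic with guessed figures (none of these numbers come from any published source):

```python
# Guessed figures: a beefy OTM supercapacitor pack vs. a duty-cycled
# fridge and a drivetrain. Purely illustrative arithmetic.
pack_wh  = 20_000   # hypothetical OTM supercap pack, watt-hours
fridge_w = 60       # well-insulated fridge, average draw
drive_w  = 150      # average drivetrain draw while roaming

print(f"parked: {pack_wh / (fridge_w * 24):.0f} days of cooling")
print(f"roaming: {pack_wh / (fridge_w + drive_w):.0f} h of drive time")
```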

There are some entertaining alternatives like magnetic induction, using a receiver pad in the underside of the unit and charging pads buried in the sidewalk[3].  This would keep those vending machines on the sidewalk and potentially allow them to self-limit their range to just a certain array of associated charging pads.  They could be using something like Tesla's wireless power concept (though I doubt it since that saturates the entire area with a charge), or potentially even be operating on a small hydrogen combustion generator like you'd find in a series hybrid car.  Ones operating outdoors could potentially use high-efficiency solar, either with a material like Vantablack paired with the high-efficiency thermoelectric converters we know exist (because they're used in thermonuclear reaction generators[4]) or maybe just photovoltaics.

Cooling would be an interesting proposition.  Electrically-driven conventional mechanical refrigeration is one option, of course.  If they're operating on the hydrogen combustion generator option they could potentially be using the cryogenic fuel itself as a coolant to keep the beverages frosty cold, which would also serve to warm the fuel for the injectors[5].  Thermoelectric cooling is another option, again thanks to OTM-based superefficient thermoelectric technology.  Maybe the cans themselves have a small flask of liquid CO2 that releases during dispensing to flash-cool and carbonate the drink (using a CO2 fire extinguisher is a surprisingly effective method to rapidly cool drinks on a hot day).

I'm a bit curious how autonomous those vending machines actually are and how they detect customers.  I'd guess they're probably semi-autonomous, remotely managed by some central system that is tracking inventory and the locations of the various machines so it can recall them for restock or repair.  They definitely have rudimentary voice recognition, as they respond to shouts of "cola" in Super Dimension Fortress Macross, coupled with directional microphones to allow them to locate the speaker.  I wonder if they're using LIDAR or RADAR to track the terrain to make sure they don't tip over on the curb or stray into a navigational hazard like traffic.  The unit definitely seems to want to rotate towards whoever hailed it, so I'm guessing it's not full 360 degree sensing.  I've got a keen suspicion that the way it differentiates potential customers from other objects is by infrared.  It's probably setting the ambient temperature as the background level and then looking for objects warmer than that around 36-38 degrees C.  Maybe it's using a laser rangefinder to ensure it stops at a comfortable distance from its summoner to ensure it doesn't accidentally mow them down?
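The infrared discrimination step itself would be nearly trivial; a sketch, with invented thresholds and a fake test frame:

```python
# Flag pixels a few degrees above ambient and within human skin
# temperature range. Thresholds and the test frame are invented.
import numpy as np

def warm_body_mask(thermal_c, ambient_c):
    lo = max(ambient_c + 2.0, 30.0)   # ignore sun-warmed pavement
    return (thermal_c >= lo) & (thermal_c <= 38.0)

frame = np.full((4, 4), 22.0)   # fake 4x4 thermal frame, 22 C ambient
frame[1:3, 1:3] = 34.5          # a face-sized warm blob
print(warm_body_mask(frame, 22.0).sum(), "candidate pixels")  # 4
```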

 

 

1. Sharon Apple doesn't count as a separate category because her AI was military-grade hardware that was shared with the AIF-X-9 Ghost... and she probably would have been used for the Minmay Attack if she'd been completed in a stable state.
2. As a network engineer, I loathe the term "the cloud".  It's obfuscatory language at best.  There is no cloud, it's just someone else's computer.
3. This technology is being toyed with for extending the range of pure-electric cars, embedding wireless charging pads in the surface of the road that switch on as a vehicle passes over them and off again when the vehicle is past them, allowing it to be "plugged in" and actively recharging while driving.  This tech was proposed back in '72 by Professor Don Otto of the University of Auckland and has gone into practical trials on a few stretches of road in Britain in the last few years.
4. One of the two power stages of a thermonuclear reaction turbine engine, the other being a high-efficiency MHD dynamo.
5. Variable Fighter Master File: VF-1 Valkyrie Vol.2 indicates the VF-1 uses the hydrogen fuel slush in its tanks as a coolant for its engines prior to introducing it into the reactor in this fashion.

Posted (edited)

... since we're talking of rogue vending machines, this needs to be posted:

("Aggressive" vending machine designed and built by Jamie Hyneman of M5 Industries and Mythbusters fame.)

 

 

Edited by Seto Kaiba
Posted (edited)
 

which naturally explains why the technology is nowhere to be found in Macross outside of high-end military hardware[1], litter-picking robots, and vending machines.

...litter-picking robots way more clever than a Roomba. That is like saying something is nowhere to be found except everywhere, including levitating mics that position themselves in front of a singer just when needed and then fly away without hitting anything when unneeded.

 

 

The reason those applications of very, VERY rudimentary heuristics work at all is 1. because of how basic they are and 2. because much of the actual processing burden is non-local... it's being done external to the system-on-chip, on a remote server or in "the cloud"[2].  When you try to do the thinking locally, it requires a lot more system resources.

Ah, I see the reason for the misconception: the reason those applications work "in the cloud" is because Google's business model is all about the cloud. You can install LibreOffice or even full Linux distros on Android phones; those are executed locally on the mobile. This I know from firsthand professional experience, having made Citrix terminals out of those. And before you point out that Citrix is all about remote execution, remember it is all done in a local web browser, and a friggin' web browser is no laughing matter, nor is H.265 decompression WITHOUT hardware decoders.

So your civilian litter-picking robot or Petite Cola vending machine does better than a self-driving car reading the wrong speed limits on a highway? That is OTEC for you.

 

I wonder how many harassment or stalking lawsuits Petite Cola has gotten over the years? (ANOTHER IMPORTANT QUESTION HERE!)

Still annoying people in 2045, even more than Basara-san.

Edited by Aries Turner
Posted
 

...litter-picking robots way more clever than a Roomba.

Granted, the litter-picking robot would have to be marginally cleverer than a Roomba... but the Roomba is about as basic as your consumer-grade robotics get.  Its program is little more than a scheduler, a motor-RPM-based distance measurement algorithm, and a directive to turn ninety degrees when it bumps into something or is half a device radius from overlapping its path.  It really isn't beyond the scope of what a high school robotics team could knock out. 

That said, the litter picking robot's additional complexity wouldn't be enough to put it out of the reach of the high school robotics crowd either.  The additional embedded controls for the arm are almost literal child's play these days, making image processing that design's only real improvement over the Roomba.  
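Literally, the described behavior fits in a few lines; a sketch against a hypothetical robot interface:

```python
# The Roomba-style program described above, made concrete. The `robot`
# object and its methods are hypothetical.
import random

def clean(robot, minutes=30):
    robot.start_timer(minutes)                      # the "scheduler"
    while not robot.timer_expired():
        robot.drive_forward()
        traveled = robot.odometry_from_wheel_rpm()  # RPM-based distance
        if robot.bumper_hit() or robot.path_overlap_within(traveled, 0.5):
            robot.turn_degrees(90 * random.choice((-1, 1)))
```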

 

 

That is like saying something is nowhere to be found except everywhere, including levitating mics that position themselves in front of a singer just when needed and then fly away without hitting anything when unneeded.

I've seen a lot of unsubstantiated fan theories over the years, but this is one of the odder ones...

Has it occurred to you that, as a microphone in a concert venue, it's vastly more likely that it's simply a remote-controlled device run by the venue's sound engineer in response to predefined cues in a choreographed performance with a set list?  (Basara's a prick, to be sure, but even he doesn't generally deviate from the set list.)

Your contention that AI technology is everywhere in Macross is entirely unsubstantiated.

(Mind you, I'd be curious how that microphone is hovering.  There's no evident thrust and it's too small to be contragravitic.  Maybe a Biefeld-Brown effect electrohydrodynamic lifter?  Though I guess you wouldn't want to touch it if that were the case.)

 

 

Ah, I see the reason for the misconception: the reason those applications work "in the cloud" is because Google's business model is all about the cloud. You can install LibreOffice or even full Linux distros on Android phones; those are executed locally on the mobile.

I'm not sure where you thought you were going with this one... there's a pretty significant difference in processor, memory, and sensor utilization between a barebones open-source word processor and, say, a piece of software that's trying to convert your speech into text, divine your intent, and convert it into actionable instructions for itself and other applications.  The reason the system-on-chip AI software depends so heavily on external processing is because doing all the processing locally would require significantly greater local storage and place far greater demands on the processor, memory, and energy storage system to do the job with anything close to the same level of accuracy.

 

 

So your civilian litter-picking robot or Petite Cola vending machine does better than a self-driving car reading the wrong speed limits on a highway? That is OTEC for you.

Well, maybe better than a Tesla... but their entire autonomy strategy seems to be "make misleading statements, backpedal, promise there'll be a patch in the future".

The vending machine and litter-picking robot are both within the reach of today's technology, and the litter-picking robot would be little more sophisticated than a consumer-level robotic vacuum cleaner or lawnmower.  The vending machine would be nearly as complex as an autonomous car, mainly to avoid damaging itself and hurting people.  Either could be available today, but in practice they aren't because they are "awesome but impractical".

Posted
 

Has it occurred to you that, as a microphone in a concert venue, it's vastly more likely that it's simply a remote-controlled device run by the venue's sound engineer in response to predefined cues in a choreographed performance with a set list?

Yeah, it did. And for the sake of argument let's assume that assumption is correct, even if it isn't (see later). Such precise and immediate movement when Mylene is turning her head, the mic never ceasing to be right in front of her mouth while describing an arc, can't be executed with manual controls. As an operator, you'd need a BDI/BDS (Brain Direct Interface System) for such immediate, precise, no-delays, no-glitches movement. But BDI is itself a form of advanced AI that transforms intent into motion. It would require a tremendous amount of power, but a sound studio, even a CIVILIAN one, could pull off the feat. But that would also mean AIs are everywhere, defeating your counterargument.

But there is more: THE MICS ARE PRESENT IN A PRACTICE SESSION IN BASARA'S HOME. I don't envy that sound engineer, ready 24/7 at the expense of Basara's whims. He may even purchase full AI support for the band with the savings of a lifetime, just to be able to sleep again.

 

The reason the system-on-chip AI software depends so heavily on external processing

Converting speech to text and divining your intent can be done with your phone in airplane mode. If you are searching for the nearest restaurant and you have the ~100MB map of your area, both Siri and Google Now will point it out to you, without the otherwise obligatory internet search. There are very good reasons for external processing, but SoCs are, right now, far more capable than you seem aware of, for a Telecom Engineer. But don't believe me: just put your phone in airplane mode and give it a try.

 

 

The vending machine and litter-picking robot are both within the reach of today's technology

Prove it. Not even Amazon has delivered its promised quadcopters carrying the things you bought. A vending machine would step on feet, fall over broken asphalt, be stopped by unmapped guardrails, run into unsafe areas, and endanger road lanes...

Posted
 

Yeah, it did. And for the sake of argument let's assume that assumption is correct, even if it isn't (see later). Such precise and immediate movement when Mylene is turning her head, the mic never ceasing to be right in front of her mouth while describing an arc, can't be executed with manual controls.

It can be easily achieved by computer control, and unlike AI we can show that computers have infiltrated all manner of areas that they wouldn't reach until decades later in the real world, like tablets having replaced newspapers, television cameras being remotely operated drones, etc.  Program the cues, and let the computer handle the timing of the mics.  It doesn't need to be an AI, it just needs a little prior planning, not dissimilar to the cues used to turn mics on and off in theater performances and concerts when a mic isn't needed.

Even if it were an AI managing it, it wouldn't establish that AI technology is ubiquitous in Macross.  It would just confirm what we already knew: that rudimentary AIs can be used to take over certain jobs nobody wants to do, like picking rubbish or putting a microphone in front of Basara without beating him with the stand, while most of the rest of technology seems unimpacted.

I could see a stronger case for it if the microphones responded to, say, gestures (like summoning one with a wave of the hand), demonstrated collision avoidance behavior, or could follow Fire Bomber outside the stage, all of which would require a lot more immediate, precise control outside of what could be preprogrammed.

 

 

But there is more: THE MICS ARE PRESENT IN A PRACTICE SESSION IN BASARA'S HOME. I don't envy that sound engineer, ready 24/7 at the expense of Basara's whims. He may even purchase full AI support for the band with the savings of a lifetime, just to be able to sleep again.

While I've not been around many bands trying to make it big, even the amateur musicians I know tend to own at least a microphone or two and some entry-level mixing equipment... it would not be unreasonable for Ray to have purchased that sort of thing for practice.

 

 

Converting speech to text and divining your intent can be done with your phone in airplane mode. If you are searching for the nearest restaurant and you have the ~100MB map of your area, both Siri and Google Now will point it out to you, without the otherwise obligatory internet search. There are very good reasons for external processing, but SoCs are, right now, far more capable than you seem aware of, for a Telecom Engineer. But don't believe me: just put your phone in airplane mode and give it a try.

Yes, it can... but at a lower level of precision and with a lot fewer features.  That's the point I've been tilting at here.  Yeah, you can technically run some of these rudimentary AI features exclusively on a system-on-chip, but at the cost of significant demands on processor time, memory, local storage, and energy, generally greater than what you would see in an operating environment where you can displace those resource-intensive operations to a less limited system.  This is why many modern AI technologies are network-dependent.

 

 

Prove it. Not even Amazon has delivered its promised quadcopters carrying the things you bought. A vending machine would step on feet, fall over broken asphalt, be stopped by unmapped guardrails, run into unsafe areas, and endanger road lanes...

The technology already exists for autonomous vehicles.  LIDAR systems can monitor the movements of other vehicles, people, animals, road markings, and hazards in the proximity of the vehicle; low-power RADAR handles short-range collision avoidance; infrared for living-object detection and motion tracking is commonly used in the Xbox's Kinect peripheral and Nintendo's Wiimote; a decently grunty lithium-ion battery can be pillaged from something like an electric wheelchair or a small electric golf cart; and emotors can be obtained at shockingly compact sizes that, with a few modest gear reductions, can easily develop enough torque to shift a vending machine (see Jamie's mechanical ascender in Mythbusters's superhero special, which uses an emotor about the size of a soda can and a few gear reductions to lift a standard 95th-percentile male).  Really, a robotic vending machine is basically just that robotic mall cop with a minifridge strapped to it.

All the various pieces exist but, as I said, it's "awesome but impractical".  All that extra expense for what actual gain, besides making an already expensive vending machine MORE expensive without materially improving its ability to do its job?

Posted
 

Yes, but can an AI tell us whether Petit Cola tastes more like Coca-Cola or Pepsi?

Watch it turn out to be the South Ataria version of RC Cola. 

Posted
 

Watch it turn out to be the South Ataria version of RC Cola. 

If the Petit Cola Company has taste that good, it is no wonder they apparently dominate the industry.

Posted
 

Yes, but can an AI tell us whether Petit Cola tastes more like Coca-Cola or Pepsi?

Yes, but sentient AI is such a psychotic crapshoot it had to be outlawed entirely... so you know it prefers store brand cola.

Posted
 

Program the cues, and let the computer handle the timing of the mics.  It doesn't need to be an AI, it just needs a little prior planning

You can't program for spontaneous head movement. But anyway:

https://www.merriam-webster.com/dictionary/artificial intelligence

Definition of artificial intelligence

1 : a branch of computer science dealing with the simulation of intelligent behavior in computers
2 : the capability of a machine to imitate intelligent human behavior.
 
I rest my cas... NO, WAIT!
 
 

I could see a stronger case for it if the microphones responded to, say, gestures (like summoning one with a wave of the hand)

It is funny you thought about that, because that actually happens in the aforementioned sequence at Basara's home: Basara waves his hand and the flying mic disengages.

NOW I rest my case.

 

but at a lower level of precision and with a lot fewer features.  That's the point I've been tilting at here. 

I don't see the point of a Telecom Engineer beating around the bush with things even he knows are wrong. More cores or more storage don't enhance precision, but speed of processing and amount of data, respectively. In layman's terms: either Siri has the map to point you to the nearest restaurant or she doesn't even understand the language you are speaking to her; either the drone has the map of the area it is navigating or it crashes into Mt. Erebus.

 

The technology already exists for autonomous vehicles.

The HARDWARE SENSORS do exist. The SOFTWARE making sense of the input, the AI, doesn't. Most successful autonomous vehicles navigate in uncontested airspace, ascending to operational altitude within the controlled airspace of a military base. So in fact SOFTWARE AI does exist, within those strict limits of operation. BUT it doesn't exist at a level able to operate within crowded civilian airspace, outside Greenland and the Arctic. Much less within populous cities.

Posted
 

You can't program for spontaneous head movement.

You can program for a choreographed performance, or to have the unit hold position relative to a part of a person's body.  (Position-holding could be achieved by something similar to beam-riding guidance, with them wearing some directional transmitter that the microphone is programmed to stay centered in the path of.)
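That kind of beam-riding hold reduces to a small proportional control loop; a sketch, with an invented gain:

```python
# Proportional controller nudging the mic back toward the beam center
# of a wearable transmitter. Gain and sample offsets are invented.
def correction(offset_x_m, offset_y_m, gain=0.8):
    """Map measured beam-center offset to a velocity command (m/s)."""
    return -gain * offset_x_m, -gain * offset_y_m

# Each control tick: sense the mic's offset from the beam centerline,
# command a small correction, repeat fast enough to track a head turn.
vx, vy = correction(0.05, -0.02)   # mic 5 cm off in x, 2 cm off in y
print(f"command {vx:+.3f}, {vy:+.3f} m/s")
```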

 

 

It is funny you thought about that, because that actually happens in the aforementioned sequence at Basara's home: Basara waves his hand and the flying mic disengages.

I legitimately did not recall that occurring in the series, but then it has been a good while since I last rewatched Macross 7.  I concede the point that the microphone system may have a basic AI for guidance.  That doesn't establish AI as ubiquitous in the Macross setting, but it does establish one more (temporary) category where AI made inroads into the consumer-level technology of the Macross universe.  (One does have to wonder why they were seemingly abandoned in favor of conventional microphones later on, though... did someone lose an eye or something?  We do see civilian drones, but curiously they seem to be little more sophisticated than what we have today.)

 

 

I don't see the point of a Telecom Engineer beating around the bush with things even he knows are wrong. More cores or more storage don't enhance precision, but speed of processing and amount of data, respectively. In layman's terms: either Siri has the map to point you to the nearest restaurant or she doesn't even understand the language you are speaking to her; either the drone has the map of the area it is navigating or it crashes into Mt. Erebus.

You're looking at it from the opposite direction I am, which I suspect is why my point isn't getting across here.

If you want to support a software feature, you need to have the available system resources to support it, and ideally you don't want to redline the system doing it.  This is why most consumer-grade technology with rudimentary AI is extremely simple (e.g. the Roombas) or network-based.  Even basic AIs tend to have surprisingly high system requirements, and that leaves you a choice of expanding the available resources locally (more cost), running the system ragged (reduced lifespan, both in operating time and overall durability), or outsourcing the complex tasks to another system with more resources.  This is why those cheap system-on-chip setups you keep referring to are cheap... because the AI features they support are largely cloud-based to avoid the additional cost and complexity of faster processors, more memory, more local storage, and more capacious batteries.  The more you want the system to support locally, the more demand on the local hardware and the more expensive the unit gets.

Now, if we're talking about something like a FAST Pack that's meant to be as aggressively inexpensive as it can be for the military, there's two questions to ask.  What benefit would having an AI on the pack be, and is it sufficient to justify additional cost?  As I see it, there's no real benefit to it when the packs are tied directly into a military-grade AI right there inside the fighter and they usually contain nothing but fuel tanks, verniers, rocket boosters, and weapons systems slaved to the fighter's FCS.  Same story for most other military hardware.

 

 

The HARDWARE SENSORS do exist. The SOFTWARE making sense of the input, the AI, doesn't. Most successful autonomous vehicles navigate in uncontested airspace, ascending to operational altitude within the controlled airspace of a military base. So in fact SOFTWARE AI does exist, within those strict limits of operation. BUT it doesn't exist at a level able to operate within crowded civilian airspace, outside Greenland and the Arctic. Much less within populous cities.

... "autonomous vehicle" usually refers to cars.  When it's aircraft, they're usually called Unmanned Aerial Vehicles.

Level 3 and 4 autonomous cars do, in fact, have the software to receive and interpret the inputs from things like LIDAR arrays, RADAR, and so on in order to safely navigate public roads, observe and follow lane markings, prevent collisions with other vehicles or pedestrians, etc.  The same technology could be applied to create an autonomous robot vending machine, but it would be a significant increase in cost to an already expensive vending machine for no real gain besides the "ain't it cool?" factor.  Being an engineer, I'm all about making cool sh*t for its own sake, but I'd imagine potential customers are gonna want a business case demonstrating why it's an advantage over a static machine besides being a novelty (because the novelty will wear off quickly).

Posted (edited)
 

You can program for a choreographed performance

You can't program a person for a precisely choreographed performance, and that mic did not deviate from Mylene's mouth by even a micron. If the animators cared that much about such a complex scene, there must be a reason.

 

One does have to wonder why they were seemingly abandoned in favor of conventional microphones later on though

Delta didn't use conventional microphones: Frontier did. And they refurbished or replicated an SH-60 Seahawk for no practical reason, as it was there even before filming Legend of Zero.

 

 

This is why those cheap system-on-chip setups you keep referring to are cheap... because the AI features they support are largely cloud-based to avoid the additional cost and complexity of faster processors, more memory, more local storage, and more capacious batteries.

I don't fail to see your point: you fail to address that, while the above is a useful trick, it is not mandatory, and lots of cheap drones do not use any kind of cloud-based extra processing power or storage, just built-in eMMC solid-state storage and mere KiB of RAM. Still, I can't understand why this is an objection: so the AI you suggest isn't entirely located in one physical location, but is partly remote. So what? That makes QFs even cheaper, and still able to resort to a dumbed-down, no-comms-available mode.

Like a bazillion Petit Cola machines in hot pursuit of a poor fella down the road, all controlled from Petit Cola Co. or a subsidiary Master Control Program centralized AI. Suddenly comms are heavily jammed for Beauty of Jamming reasons, and all the machines enter recovery mode, without all the fancy "your emotional state suggests Petit Berry", and backtrack to the Central at safe speed for further instructions.
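That is, a two-state fallback any microcontroller could run (a sketch; the modes and speeds are invented):

```python
# Two-state fallback: full service while the link is up, dumb-but-safe
# recovery when comms drop. The machine object is hypothetical.
def step(machine, link_ok):
    if link_ok:
        machine.mode = "SERVICE"     # cloud-side persuasion AI available
        machine.max_speed = 1.5      # m/s, customer-pursuit speed
    else:
        machine.mode = "RECOVERY"    # local logic only
        machine.max_speed = 0.3      # creep back at safe speed
        machine.target = machine.home_depot
    return machine.mode
```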

 

"autonomous vehicle" usually refers to cars.  When it's aircraft, they're usually called Unmanned Aerial Vehicles.

Wrong: the military despises the term as much as the IT crowd despises "cloud", because Uninhabited Aerial Vehicles are manned at all times from somewhere.

BTW, while LIDAR active sensors may be useful for civilian vehicles (or detecting submerged mines from a chopper), they don't solve the other part of the equation for safe navigation: reading RELEVANT traffic signals. That is where the AI fails. A robot vending machine can enter an unobstructed area faultlessly, then detect with accurate LIDAR precision the unavoidable road traffic, because it failed to interpret a crosswalk variant and a fancy animated LED DO NOT CROSS street light.

Edited by Aries Turner
Beauty of Edition
