Haven't done a 2.0 in a while now. Mostly just because I realized after the last one that I've been out of the loop. I used to be on top of tech news, but I've been grossly slacking. My resources have been pretty out of date for a while now. Needless to say, my own innovations have been lacking as well. I want to get back into that. So here we are, I guess.
Looking at the way things are going right now, I've found myself looking toward smaller innovations. Things to augment the tech we already have. Broad-spectrum stuff like the 'modular model' is on my mind, but I can't say in earnest that it's something I truly want to discuss here. At least not for now. I still have to clear some stuff up in my head before I just put it out there, so I guess I'll stick with a much smaller idea. I present to you: "AI."
Nothing fancy, I'm just talking about Artificial Intelligence. I know, I know, it's already a thing, but that's exactly the issue. It's already a thing, yet its implementation is so infantile that it's baffling. At least to me. Honestly, how is it that we have pseudo-AI and we're still having "disagreements" at a [insert highest level of government] level on literally ANYTHING that relies solely on statistics to find the trends/patterns that could determine efficient and effective measures for the populace as a whole? Likely the same reason a robot apocalypse is very possible: because the people who run the world (publicly) grew up in an age when computers didn't exist. And because of that, we're all doomed to be ill-prepared for the transition period. Not going to say WWIII, but well, WWIII. This is the precipice of the next age. But this one is digital. The lag behind will be like first world versus third world, except third world will then be the equivalent of tenth at that point. And just like that, we're slowly shifting into a Mainstay. Guess I'll have to re-brand and try again at a later date.
True AI has been described as infantile by those in the field because the common view (amongst the scientific community) is that our understanding of human intelligence is still so limited. That's laughable to me. I keep seeing things like that--see, for example, the great debate over autonomous vehicles tackling moral dilemmas--and wonder where the disconnect seems to be propagating from. At the core of computer science is binary. At the core of humanity? Generally speaking? DNA. Now I'm oversimplifying this, but bear with me here. We're talking a difference of two, on a scale that expands outward by factors that quickly become incomprehensible. So, it matters. The difference, that is. It matters, and with it comes the problem. Not emulation, but the expectation that computational artificial intelligence would behave like that of a human. What the hell!? How? It's baffling to me. As much as it is nonsensical. But here we are. "Lost in Translation." I should link back to one of my more recent Mainstays, but lazy. Okay, not lazy enough, I guess. Anyway, correlation, right? Miscommunication. There is a gross disconnect and, main point, that's what's holding humanity back from the next stage of evolution. Bold statement? Sure. But this is why it's a Mainstay now. So close it up.
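If you want to see just how fast that "difference of two" compounds, here's a throwaway Python sketch. Purely illustrative combinatorics, assuming you only compare raw alphabet sizes (2 symbols for binary, 4 bases for DNA) and ignore everything biology actually layers on top of that:

```python
# Purely illustrative: count how many distinct sequences of length n you can
# write with a 2-symbol (binary) alphabet versus a 4-symbol (DNA-style) one.
# Raw combinatorics only; says nothing about how either system encodes meaning.

for n in (8, 64, 1000):
    gap = (4 ** n) // (2 ** n)  # == 2**n: the gap itself grows exponentially with length
    print(f"length {n}: the 4-letter space holds 2^{n} "
          f"(a {len(str(gap))}-digit number) times more sequences than binary")
```

Point being, a "difference of two" at the alphabet level stops being small the moment sequences get long, which is roughly the intuition I'm leaning on above.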
Full disclosure: I have been seeing some scarce but very real rumblings from like-minded individuals, but they seem to be drowned out by the mainstream. Or then again, it could just be due to my outdated sources. Either way, it feels a bit better to finally clear this off the mind. Even though it feels a bit derivative and is a stark departure from the original goal, flow is flow. We just let it take us, so sayeth the writer, or some proverbial BS and whatnot. Back to the regular stuff whenever. Take care.