A while ago I heard talk of a company flirting with the notion of an AI executive of some kind, maybe a CEO. Now, knowing a bit more about smart contracts, that notion seems incrementally more feasible.
I wonder whether the growth of capitalism can be preserved, to some degree, without requiring humans to get uber-rich. I may be thinking too far ahead, and perhaps overstating it at that, but smart contracts in essence outsource contract officiation to algorithms, and that capability seems broad, almost totipotent.
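To make that "contract officiation by algorithm" idea concrete, here is a toy sketch in plain Python. It is not real blockchain code, and the names (`Escrow`, `confirm_delivery`, `settle`) are hypothetical; the point is only that the terms of the agreement are code, and settlement happens without a human arbiter.

```python
class Escrow:
    """A toy self-officiating agreement: funds go to the seller if
    delivery is confirmed, back to the buyer otherwise. Illustrative
    only; real smart contracts run on a blockchain, not in a class."""

    def __init__(self, buyer, seller, amount):
        self.buyer = buyer
        self.seller = seller
        self.amount = amount
        self.delivered = False
        self.settled = False

    def confirm_delivery(self):
        self.delivered = True

    def settle(self):
        # The "officiation": the outcome follows mechanically from
        # recorded state, with no human judgment in the loop.
        if self.settled:
            raise RuntimeError("contract already settled")
        self.settled = True
        recipient = self.seller if self.delivered else self.buyer
        return recipient, self.amount


contract = Escrow(buyer="alice", seller="bob", amount=100)
contract.confirm_delivery()
print(contract.settle())  # ('bob', 100)
```

If delivery were never confirmed, `settle()` would route the amount back to the buyer instead; either way the rule, not a person, decides.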
In principle, and perhaps naively, it seems that as more of industry and life becomes data-driven, certain machine-learning systems may come to know whole swathes of their industries better than human experts do.
If both the quality (robustness) of data capture and the quantity (volume) of data captured keep increasing, it would seem to follow that machine-learning systems fluent in that data would become veritable experts, commanding an acumen that only exemplary human ingenuity could best.
That said, I see no problem with preserving the administrative oversight of human programmers, again to sidestep the obvious dystopian forecasts.
I've also just started learning about programming languages, and my understanding is still quite messy. I've begun considering the prospect of thinking so systematically that one could express oneself, as it were, in a programming language. I assume those who have designed programming languages have had comparable experiences, if not such philosophical ones.
That could constitute a step closer, albeit an arcane one, to AGI: translating the kind of knowledge we would now consider distinctly human into a more programmatic, algorithmically amenable format.