Buckets of Crumbs!!!
I just posted a way deeper and more interesting blog post a couple hours ago (using multiverse theory and Occam's Razor to explain why voting may often be rational after all), but I decided to post this sillier one tonight too because I have a feeling I'll forget if I put it off till tomorrow (late at night I'm willing to devote a little time to blogging in lieu of much-needed sleep ... tomorrow when I wake up there will be loads of work I'll feel obliged to do instead!)
This blog post just re-"prints" part of a post I made to the AGI email list today, which a couple people already asked me if they could quote.
It was made in response to a poster on the AGI list who made the argument that AGI researchers would be more motivated to work on building superhuman AGI if there were more financial gain involved ... and that, in fact, desire for financial gain MUST be a significant part of their motivation ... since AGI researchers are only human too ...
What I said is really simple and shouldn't need to have been said, but still, this sort of thing seems to require constant repetition, due to the nature of the society we live in...
Here goes:
Singularitarian AGI researchers, even if operating largely or partly in the business domain (like myself), value the creation of AGI far more than the obtaining of material profits.
I am very interested in deriving $$ from incremental steps on the path to powerful AGI, because I think this is one of the better methods available for funding AGI R&D work.
But deriving $$ from human-level AGI really is not a big motivator of mine. To me, once human-level AGI is obtained, we have something dramatically more interesting than the accumulation of any amount of wealth.
Yes, I assume that if I succeed in creating a human-level AGI, then huge amounts of $$ for research will come my way, along with enough personal $$ to liberate me from needing to manage software development contracts or mop my own floor. That will be very nice. But that's just not the point.
I'm envisioning a population of cockroaches constantly fighting over crumbs of food on the floor. Then a few of the cockroaches -- let's call them the Cockroach Robot Club -- decide to spend their lives focused on creating a superhuman robot which will incidentally allow cockroaches to upload into superhuman form with superhuman intelligence. And the other cockroaches insist that the Cockroach Robot Club's motivation in doing this must be a desire to get more crumbs of food. After all, just **IMAGINE** how many crumbs of food you'll be able to get with that superhuman robot on your side!!! Buckets full of crumbs!!!
(Perhaps after they're resurrected and uploaded, the cockroaches that used to live in my kitchen will come to appreciate the literary inspiration they've provided me! For the near future though I'll need to draw my inspiration elsewhere as Womack Exterminators seems to have successfully vanquished the beasties with large amounts of poisonous gas. Which I can't help feeling guilty about, being a huge fan of the film Twilight of the Cockroaches ... but really, I digress...)
I'm also reminded of a meeting I was in back in 1986, when I was getting trained as a telephone salesman (one of my lamer summer jobs from my grad school days ... actually I think that summer I had given up on grad school and moved to Las Vegas with the idea of becoming a freelance philosopher ... but after a couple months of phone sales, which was necessary because freelance philosophers don't make much money, I reconsidered and went back to grad school in the fall). The trainer, a big fat scary guy who looked and sounded like a meaner version of my ninth-grade social studies teacher, was giving us trainee salespeople a big speech about how everyone wanted success, and he asked us how success was defined. Someone in the class answered MONEY and the trainer congratulated him and said: "That's right, in America success means money, and you're going to learn to make a lot of it!" The class cheered (a scene that could have been straight out of Idiocracy ... "I like money!"). Feeling obnoxious (as I usually was in those days), I raised my hand and asked the trainer if Einstein was successful or not ... since Einstein hadn't been particularly rich, I noted, that seemed to me like a counterexample to the principle that had been posited regarding the equivalence of success and financial wealth in the American context. The trainer changed the subject to how the salesman is like a hammer and the customer is like a nail. (By the way, I was a mediocre but not horrible phone salesman of "pens, caps and mugs with your company name on them." I had to use the name "Ben Brown" on the phone, though, because no one could pronounce "Goertzel." If you were a small business owner in summer 1986 and got a phone call from an annoying crap salesman named Ben Brown, it was probably the 19-year-old version of me....)
3 Comments:
Even though my path has led me toward a greater possibility of financial gain, I have recently come to the conclusion that:
1. Money is a side effect of personal and intellectual wealth.
2. Scrambling for the crumbs is what causes a lot of the pain at my place of work.
3. Adopting the mental model which values the journey over some illusion of a goal makes for much more health, happiness, and humor.
I'd much rather come home singing and laughing at the bullshit than have a massive heart attack at 50.
Funny thing is: Once I adopted this attitude, things started to go very well for me financially.
Bravo.
We often hear the Bill Gates bit about AGI being "worth ten Microsofts".
But isn't Microsoft's worth mostly due to the fact that there's only ONE of them? In other words, its monopoly value?
I applaud your priorities, and I urge you to be careful of the pressure that the profit motive of others can bring to bear on you, even if your own priorities are different. They can be extremely strong.
Don't forget us cockroaches.
Another cockroach here too. I agree with the "money as a side effect of mental wealth".
Does Mr. Gates fund AGI research himself? I would if I had a few B to spare. Hell, I'd call each prominent AGI researcher and ask "How much is more than enough? And your bank acct #, please? Ok, got it. Thank you. Have a nice day."