Date: Wed, 20 Dec 2000 10:07:19 -0500
From: Paul Fernhout <pdfernhout@kurtz-fernhout.com>
Reply-To: unrev-II@egroups.com
Organization: Kurtz-Fernhout Software
To: unrev-II@egroups.com
Subject: Is "bootstrapping" part of the problem?
Gary,
Thanks for all the great comments.
Garold L. Johnson wrote on December 19, 2000...]
Another issue is that competence in science (as it is classically practiced in academia, as a certain type of inquiry-based intellectual pursuit within a single domain) does not translate to competence in everyday human affairs. The two are not mutually exclusive -- just different skills. (Howard Gardner's book "Multiple Intelligences" touches on this.) People are often drawn to things that reflect their personality, and various personality traits are more useful in various roles. For example, being a suspicious person will make you an excellent software debugger, but may make relationships with clients difficult.
There is a wonderful paper by the late Diana E. Forsythe called "Engineering Knowledge: The Construction of Knowledge in Artificial Intelligence" [Diana Forsythe, Engineering knowledge: The construction of knowledge in artificial intelligence, Social Studies of Science 23(3), 1993, pages 445-477]. As an anthropologist studying AI workers, she concludes more or less that they in general have a very narrow view of intelligence (and mind), which in part reflects their personalities. It's actually rather humorous (in an ironic way) to see someone trained to study what people do comment on highly technical (and arrogant?) AI researchers who claim to know from naive experience what human experts do when they think on the job and how to capture that in a machine. I don't have an online reference for the paper itself, but it is cited quite a bit.
Garold L. Johnson wrote on December 19, 2000...]
I think this is the crux of why what we are doing might make sense. You put it very succinctly here. And as Margaret Mead is said to have put it: "Never doubt that a small group of thoughtful, committed citizens can change the world; indeed, it is the only thing that ever has."
Garold L. Johnson wrote on December 19, 2000...]
[Garold L. Johnson] Unfortunately, we can start that debate and expend all of our energy on it and get nothing accomplished.
You have another good point. I won't say I completely agree with it, but it would be good to see how we could turn what might otherwise be an energy absorbing thing into something productive. Perhaps creating the infrastructure to have such discussions?
I do agree it is a minefield. It almost comes down to a religious belief (values and desires).
However, my purpose in writing the original email was to prod again on a topic I brought up before (basically "bootstrapping for what") because I think it important for individuals to keep this in mind even without a joint statement of purpose.
I am not a practicing Unitarian, and this is not an attempt to convert anyone on this list, but as one of the world's most non-dogmatic religions, the UU statement of faith might be of interest as a starting point:
The inherent worth and dignity of every person;
Justice, equity and compassion in human relations;
Acceptance of one another and encouragement to spiritual growth in our congregations;
A free and responsible search for truth and meaning;
The right of conscience and the use of the democratic process within our congregations and in society at large;
The goal of world community with peace, liberty, and justice for all;
Respect for the interdependent web of all existence of which we are a part.
Again, this is not an attempt to undermine anyone's specific faith, but to point out that often one can make a set of affirmations that are general enough to be inclusive, while still being a positive statement.
I'm sure one can find similar statements in various other religious traditions which are statements of core value apart from specific dogma. My point is that when we ask "bootstrapping for what?" we should at least have a nebulous positive answer involving the worth of the human experience, rather than "just to make things go faster".
Perhaps too big a can of worms to open...
Garold L. Johnson wrote on December 19, 2000...]
Good point. However, at least in terms of what I want to do (related to letting people pick the size of organization needed for life support from village to planet) presumably one might be able to reduce the scale of some problems by addressing them in the context of smaller groups. (Eric's Cohousing-like...)
Garold L. Johnson wrote on December 19, 2000...]
Well put. I like this statement.
Garold L. Johnson wrote on December 19, 2000...]
Perhaps the issue in a nutshell. (Even if what constitutes a way out may be different for different groups...)
Garold L. Johnson wrote on December 19, 2000...]
Well, I agree with the sentiment, but information on organic agriculture (and the problems with pesticides) has been available for a long, long time. Rodale Press goes back to the 1940s.
In the 1930s biocontrols were becoming widely used (before being replaced by petrochemicals). What perhaps is newer is to see additional drawbacks to conventional agriculture (which never had the burden of proof of safety), such as breakdown components acting as estrogen mimics and leading to developmental problems. [I was program administrator about ten years ago for the NOFA-NJ organic farm certification program.] However, if what you mean is that it is now widely of interest, then yes, interest in organic farming does seem to be following an exponential growth curve and is now becoming noticeable.
Garold L. Johnson wrote on December 19, 2000...]
I agree it likely can't be stopped. The issue is a matter of either directing it to positive ends, or if that can't be accomplished, using some fraction of it for positive ends and to survive the rest of its implications.
Garold L. Johnson wrote on December 19, 2000...]
3. Accepting the Politics of Meeting Human Needs. Addressing human needs (beyond designing an OHS/DKR) was one of Doug's major goals and something that occupied many presenters in the Colloquium. The colloquium needs to accept that there are effectively no technical issues requiring extensive innovation related to supporting contemporary society that are of any significant importance.
[Garold L. Johnson] This is true, but not terribly relevant, I am afraid. We have had the technological ability to carry out nearly any set of goals that we could get sufficiently widespread agreement to tackle for years. To the extent that there is hunger in the world, for example, it is held in place by governments and those in power to whom their power is all that is of importance. This is lamentable, but it is a fact. Continuing to lament it isn't going to change it. What will change it is empowering those with the will to do something more than talk about it. This is where the efforts we are discussing can have value.
I thought it important to once again bring up this issue. Some of this is a legacy of my reactions to the earliest Colloquium speakers. But nonetheless, if we start from this premise, there is enough to go around now for basic needs even if only some people will have more than that, then the focus of our efforts might shift. In the same way people keep talking about how genetically-altered crops will benefit the developing world, when land reform or other political issues may be more important... Nonetheless, the needs of the developing world are often part of the politics of getting funding for ever advancing genetic engineering of crops like "terminator" seeds or "Round-up ready" crops. My point is simply that meeting core human needs is often cited as a reason for increasing (bootstrapping) the rate of technological progress, so clearly delineating the two seems important.
Garold L. Johnson wrote on December 19, 2000...]
Well, I think this point is subject to some debate, given the above. If we have created an ever more complex set of processes, twisted supply chains, and so forth, justified by claims that this is to meet core human needs, perhaps part of the solution is to find a way to simplify all this so core human needs can be met. I'm not saying that is necessarily possible without further innovation in organizing manufacturing technology (say, providing each village with a flexible machining center, or a $5 self-replicating food box).
I think the point you raise here is interesting, and gets at the core of justifications for "bootstrapping" as the Bootstrap Institute defines it. Still, one may question which problems are the ones requiring that level of knowledge management. For addressing the issue of people starving in Africa (or the US), I think such a technology would be nice but is not required. For addressing the issue of dealing with self-replicating machine intelligence, perhaps such tools are required.
Garold L. Johnson wrote on December 19, 2000...]
Excellent concept of a tool to help reason about values.
Garold L. Johnson wrote on December 19, 2000...]
Hopefully, one of the areas to be addressed is to make the problems more manageable. For me, this starts with asking how self-reliant a group of 10,000 people, for example, can be. I think this issue is worth addressing not because people ideally may want to live like that, but because the problem is more tractable than solving "world problems".
Garold L. Johnson wrote on December 19, 2000...]
[Garold L. Johnson] That is consistent with what I have been saying, but I believe that the issues for this forum are of the nature of "what factors involved in problems of the scale of human social and political interactions impact the requirements and design of the knowledge tools that we propose to build to assist in solving these problems?" That brings the effort into one of requirements elicitation in order to build an information management technology of sufficient power and scope to allow it to be used to address such problems.
Fascinatingly complex sentence starting with "what factors...". I agree with the sentiment, although as above, I may question just how large-scale the systems modeled have to be (or what simplifying assumptions can be made regarding things outside the system...)
Garold L. Johnson wrote on December 19, 2000...]
OK. Although, I'd go beyond this. It has been said "never attribute to malice what stupidity or incompetence can explain". That seems close to your point. But still, we must accept that decisions are made based on values. If the decision makers have values (e.g. staying in office) different from those of the people decisions are made for, then the results may not be desirable even if they are made intelligently. This is the flaw of "cost/benefit" analysis, because the issue is who pays the costs and who gets the benefits.
Garold L. Johnson wrote on December 19, 2000...]
It has been said never apply to the government for a grant to become self-reliant... :-)
Garold L. Johnson wrote on December 19, 2000...]
I'm with you here. Although for me the issue is choice and fallback positions. I would have more confidence and acceptance of living in a complex society if I knew for such there were reliable (and pleasant) alternatives if that society collapsed (from economics, war, plague, etc.)
Garold L. Johnson wrote on December 19, 2000...]
Well, obviously many educators are very dedicated, but as you point out, the system (and lack of resources) makes it hard for anyone to do a good job beyond baby-sitting and readying students for a 1950s time-card work environment.
But your deeper point is that sometimes the fix for a thing is to make it irrelevant.
Garold L. Johnson wrote on December 19, 2000...]
[Garold L. Johnson] How can we seriously expect governments to provide the solution when they are the major source of the problem?
I do think it a proper role for government to enforce the rules of a playing field. I think for example enforcing environmental regulations is a proper role of government. So too, historically, government has been involved in income redistribution and public infrastructure. I think addressing this issue of ownership, control, and equity in machine intelligences and their output is a fair role for government.
However, it is difficult to police immortal beings (corporate machine intelligences) with powers vastly beyond those of most individual people. And likewise, the governmental process can easily become part of the problem, especially as it is subverted by powerful (machine intelligence) interests.
Garold L. Johnson wrote on December 19, 2000...]
Good point. And also to quote Einstein on the dropping of the first atomic bomb: "everything has changed but our thinking..."
Garold L. Johnson wrote on December 19, 2000...]
Most of them are involved with producing goods and services which someday (soon) might be unneeded or might come from a "replicator". Not to be too Star Trek, but if nanotechnology or similar larger scale processes are capable of flexibly making on the spot most items from basic raw materials, the need for a supply chain of organizations goes away. Yet, most corporations exist to make certain goods (or related services) that fit into this supply chain.
Garold L. Johnson wrote on December 19, 2000...]
[Garold L. Johnson] We should be so lucky that they will just collapse. They will get a lot worse before that happens.
Hope not. But it might happen. Worse in what ways do you think?
Garold L. Johnson wrote on December 19, 2000...]
One possibility is:
Garold L. Johnson wrote on December 19, 2000...]
One must distinguish between "social planning" and "dictatorship" and "how things are produced". If things are mainly produced locally (replicators, supplied with little more effort than indoor plumbing for water) then social planning will be done on a very different landscape.
[Note this isn't to say we need nanotechnology, I think (hope?) reasonably efficient community level general purpose production of most things through flexible manufacturing is possible without that.]
Garold L. Johnson wrote on December 19, 2000...]
[Garold L. Johnson] The human cost is high, yes. The problem remains that we haven't yet demonstrated any system that can accomplish "unsanctioned human ends" with a lower human cost.
Whatever happened in the past, we must ask ourselves what makes sense in the future given where we are now. (Personally, I think much of the success of the U.S.A. is due to the value of the land taken from the indigenous people, ocean barriers from major wars, and the stimulating environment of a mixing of cultures and immigrants, but those are other topics.)
Garold L. Johnson wrote on December 19, 2000...]
There is no (or not much) market for air. The market for tap water is fairly specialized. The market for love is unusual. The point -- there are essential things not managed by conventional markets. We must ask ourselves specifically what markets are now getting us, as opposed to, say, local on-demand production from raw materials.
Garold L. Johnson wrote on December 19, 2000...]
We don't live in a pure capitalistic society in the U.S. That's one reason we pay taxes -- for the public good. Trillions of dollars of taxes each year. They should be spent efficiently for the public good.
Garold L. Johnson wrote on December 19, 2000...]
Good point. However, we must accept that the supply chains on which capitalism is based may become irrelevant. Look at the (usually) much more local economy of nature. A tree lives, dies, decays, and the nutrients are recycled into other trees. (Yes, there are global material flows too, of course.)
Garold L. Johnson wrote on December 19, 2000...]
Agreed. However, what we call "computing" is subject to debate and generalization. When viewed as thinking, or language, or tool use, or directing others, "computing" has been going on for a long time.
Garold L. Johnson wrote on December 19, 2000...]
[Garold L. Johnson] True, but not relevant.
Sorry. This is coming from historical issues and the tone set early in the colloquium, so the relevance really is more in relation to that background.
Garold L. Johnson wrote on December 19, 2000...]
Not quite. I am advocating distinguishing between meeting basic needs and dealing with exponential growth of technology -- really to an extent two separate things, even though the first is often used to justify the second.
Garold L. Johnson wrote on December 19, 2000...]
I see your point, and generally agree. There is a story (I forget the author) of a village by a river where they keep finding babies floating down to them in baskets. They set up an efficient way to care for the babies and are proud of it, but never feel they have enough resources left over to go upriver and see where those babies are coming from or why.
Hopefully my previous comment (current needs vs. growth issues) makes clearer though why I brought this up.
Still, for specific problems, we need to be very careful not to say "because the political problem is so hard, I will hide my head in the technical sand". However, obviously there are situations where a political problem can be resolved by a technical innovation (need examples here -- anyone got one?)
Garold L. Johnson wrote on December 19, 2000...]
A good question. I don't have a good answer.
Most humans have circuitry in their brains that helps them function as social organisms. It has been selected over many tens of thousands of years for some basic level of cooperation and values. Most corporations have few such built-in limits, except to the extent humans are in them, and in that case, we are talking about human group behavior, which is different from human individual behavior.
Garold L. Johnson wrote on December 19, 2000...]
General agreement, although as I said above, what scale needs to be addressed (given the possibility of decentralization) is an issue.
Garold L. Johnson wrote on December 19, 2000...]
Good point. Generally, perhaps the hope is that people will do more good things than bad with these tools. Or perhaps, evolutionarily, the small enclaves of humans who hit on the right good things to do with these tools will survive, whereas the masses who continue business as usual, creating ever better profit-maximizing machine intelligences, won't survive their Machiavellian progeny.
Garold L. Johnson wrote on December 19, 2000...]
[Garold L. Johnson] Perhaps so, but without tools that can handle problems of the level of social systems, we aren't going to fix it either.
Again, general agreement, with the caveat that scale is an issue.
Garold L. Johnson wrote on December 19, 2000...]
[Garold L. Johnson] This is exactly the sort of approach that I think has merit. The better the tools that you can have to build such a library, the more useful the result can be because of the design of the system, and the degree to which it is possible for you to accomplish this without government or corporate support.
Nice to hear that.
Garold L. Johnson wrote on December 19, 2000...]
[Garold L. Johnson] The first step is to take it seriously. The second is to investigate what can be done about it. That is what I see going on here.
Well, to an extent. Hopefully more so now.
Garold L. Johnson wrote on December 19, 2000...]
Lou Gerstner (IBM's Chairman) was recently quoted as talking about a near-term e-commerce future of 10X users, 100X bandwidth, 1000X devices, and 1,000,000X data. Obviously, IBM wants to sell the infrastructure to support that. But I think the bigger picture is lost.
Note: a link for that Gerstner quote is:
Garold L. Johnson wrote on December 19, 2000...]
Well, perhaps one issue as mentioned above is to create a way to have a meaningful debate on this topic (and related tools)?
Garold L. Johnson wrote on December 19, 2000...]
Good points. Still, I think they remain at the very least issues we must each, for ourselves, keep in the backs of our minds as we build tools. As Langdon Winner says in "Autonomous Technology", the greatest individual influence innovators have is in their choice of what to innovate. Ben Franklin chose to innovate bifocals, the (American) public library, and a better stove, and we are forever blessed because of those things.
Garold L. Johnson wrote on December 19, 2000...]
Thanks again for the great comments.
Sincerely,
-Paul Fernhout
Kurtz-Fernhout Software