The concept funneling effect implies that if you write about transhumanism (or any esoteric topic), then even if you manage to purge popular culture from your own mental toolbox, people will still "understand" your ideas in terms of whatever is "hip". And if you are misunderstood at the start of your writing, you will not get a chance to clarify yourself: people will automatically make the phony connection and start substituting their own experiences for your explanations. It may not be a good idea to write for the "general public", or for any group that lacks the mental building blocks needed to form transhumanist ideas; if they misunderstand you once, you will probably never get to explain things properly at all.
And from here we jump back for a moment to the original source:
Singularity Writing Advice
The scientist has no instincts that track whether the words and conclusions he's using are fifty inferential steps removed from the evidence the audience is previously familiar with. He knows that the layman doesn't know the conclusions; he doesn't realize that the layman doesn't know the premises. Reporting a result new to his scientific tribe, he takes one or two inferential steps backward to the common wisdom of his tribe; obviously this is a persuasive argument. The layman is not persuaded, and so the scientist thinks the layman is an idiot. And in turn the layman has no instinct that someone offering blatantly unsupported conclusions might be working fifty inferential steps away; he thinks the scientist is a lunatic. How often does a hunter-gatherer in a tribe with no written literature wind up more than two or three inferential steps away from everyone else?