2007/11/07
December schedule changes
Dear students,
Please note that the December schedule has changed slightly.
On 12/1 we will visit the 台灣土壤陳列館 (Taiwan Soil Museum).
On 12/8 we will travel north to take part in the "拯救地球‧全球同步‧在地行動‧抗暖化" (Save the Earth: Global Synchrony, Local Action Against Warming) event.
Please fill out your insurance information by this Saturday so it can be processed.
If you have any questions, please contact the instructor.
文魯彬
02-2311-2345 ext. 302
2007/10/13
Notes for the 10/27 Environmental Law class
Presentations should cover the following points:
1. Overview of the issue
2. Motivation for choosing the issue
3. Who has worked on similar issues before
4. Which government agencies the issue involves
5. Laws and regulations relevant to the issue
6. Past cases related to the issue
Presentations will be given in class on 10/27.
Please be on time.
Time: 10:00 AM-12:00 PM
Location: Room 生316
Posted by 蕭富印 (Ecology, Year 4, Class B), 0912-790419
2007/10/01
Course reference material
"一條河的價值" (The Value of a River)
http://zh.wildatheart.org.tw/archives/aee_eaeeeaececeeeeecec.html
Available in Chinese and in the original English; please be sure to download and read it.
Students should also review Articles 2 and 3 of the Environmental Basic Law (環境基本法),
as well as The Ecology of Commerce (商業生態學) and Natural Capitalism (綠色資本家 / 綠色資本主義).
Class time for 10/6:
10:00 AM-12:00 PM
Location: Room 316
2007/09/26
Syllabus for "Environmental Law and Case Studies" (環境法規與實例討論), first semester, academic year 96 (fall 2007)
1. Teaching objectives
This course introduces current environmental protection law to students interested in environmental protection, paired with case studies, so that students can connect real environmental cases with the relevant statutes and, when they face environmental issues later on, can combine legal theory and case practice effectively.
2. Main content
Each week the instructor first lectures on the basic theory behind that week's topic; two students per week are responsible for recording the session and presenting their reflections.
3. Assignments
1. Draft an environmental litigation plan for an environmental controversy that could affect Providence University (靜宜大學) or Shalu (沙鹿鎮).
2. Visit the 台灣土壤陳列館 (Taiwan Soil Museum), directed by Dr. 郭鴻裕 of the Council of Agriculture.
3. Take part in an activity or meeting held by the Houli residents opposing the CTSP (中科) base.
4. Grading
1. Attendance counts for 70% of the semester grade.
2. If a group member is absent without having prepared questions for the next session, the other members must choose an environmental issue and write a letter about it to the EPA (環保署) or the Taichung County Environmental Protection Bureau.
5. Schedule
Week 1: Course introduction and group formation. Introduction to systems thinking, based on Donella H. Meadows's framework in "Places to Intervene in a System" (Whole Earth, Winter 1997): http://www.wholeearthmag.com/articlebin/109.html
Week 2: Continuation of Week 1: course introduction, group formation, and Meadows's "Places to Intervene in a System".
Week 3: A. GDP (gross domestic product) vs. GHI (happiness index) or GPI (peace index); B. "The Value of a River"; C. Discussion.
Week 4: An ecological perspective on environmental law: overview.
Week 5: An ecological perspective on environmental law: overview (continued).
Week 6: An ecological perspective on air pollution law.
Week 7: An ecological perspective on air pollution cases.
Week 8: Conflicts among domestic, irrigation, and development-project water demands.
Week 9: Water pollution.
Week 10: Soil problems.
Week 11: Soil and food.
Week 12: Taiwanese law on animal habitat.
Week 13: The "others": biodiversity.
Week 14: Taiwan's dismal Environmental Sustainability Index (ESI).
Week 15: Why Taiwan's ESI ranking is so poor: 1. population and the ecological footprint; 2. laws and administrative rules; 3. links between labor law and policy and environmental disputes.
Week 16: Why Taiwan's ESI ranking is so poor: sham democracy, corporate limited liability, the rights of …
2007/05/08
Comments on the environmental impact statement for the 寶盛水族生態遊樂區 (Baosheng Aquarium Eco-Amusement Park)
1. Developing the park may (or may not) bring economic benefits, but it will certainly damage the environment.
2. Eastern Taiwan has suffered relatively little environmental damage; it should be protected more strongly, not built up further with more commerce and industry moving in.
3. A natural scenic area already offers rest and recreation; why must an artificial amusement park be added before it counts as a recreation area?
4. The name should not include the word "ecological" (生態), because the project will seriously affect the local ecology. The impact statement insists the ecological impact is small, but an ecosystem is an extremely complex system; even a slight disturbance is bound to affect the whole of it seriously.
5. If the park is unfortunately built, the influx of crowds, traffic, waste, and other ecologically damaging factors is bound to cause even more serious harm.
6. Planting exotic species damages the local ecology.
Please protect the natural areas that remain; excessive development only invites nature's backlash!
2007/03/14
Central Taiwan Science Park (中科) Phase 3, the Houli base: notes on the 七星農場 (Qixing Farm) EIA
(1) Basic information[1]
Origins: In May 2000 the President announced the "Green Silicon Island" policy, under which a third science park would be developed in central Taiwan, with bases selected in Taichung and Yunlin; the Taichung base spans land in both Taichung County and Taichung City. Phase 1 of the Taichung base followed in July 2003 and Phase 2 in June 2004; because land near the Taichung base was hard to find, Houli (后里) was chosen in June 2005 as the Phase 3 site.
Area: Phase 1, 250.75 ha; Phase 2, 81.83 ha; the Phase 3 Houli base comprises 后里農場 (Houli Farm, 134.64 ha) and 七星農場 (Qixing Farm, 111.63 ha).
Plan: Houli Farm is to host semiconductor, precision machinery, and optoelectronics industries; Qixing Farm will center on optoelectronics (AU Optronics, 友達光電).
(2) Points of contention in the EIA
1. Located on fault lines and next to a water treatment plant
Three faults cross the Houli area: 三義 (Sanyi), 屯子腳 (Tuntzuchiao), and 車籠埔 (Chelungpu); the Sanyi and Tuntzuchiao faults moreover cross each other like scissor blades. In an earthquake, a collapsed or exploding plant could release highly toxic substances and volatile organic compounds or spill strongly corrosive acids and alkalis, gravely threatening the lives and health of Houli residents and the water supply of greater Taichung.
The Houli Farm factory site sits immediately beside the 鯉魚潭 (Liyutan) water treatment plant, which serves some 3 million people in central Taiwan, so the project bears directly on the drinking-water risk of over 3 million people. The developers estimate that more than 20,000-30,000 tonnes of organic solvents and other highly toxic substances will be used each year, with 3,000 tonnes of gases emitted (at a 90% recovery rate); the risk of these substances settling onto the treatment plant is considerable (especially given Chunghwa Picture Tubes' (華映) poor record). Professor 林俊全 of the NTU Department of Geography, an expert committee member for the Houli Farm review, stated that the site "should not be developed" and held that the Qixing Farm case should likewise enter a second-stage EIA.[1]
2. Water-resource crowding (NT$50 billion in water development; agricultural and domestic water)
Flow records for central Taiwan's rivers over the past 32 years (1971-2002) show dry-season flows mostly increasing and wet-season flows mostly decreasing. In 2001 the Taichung water system supplied about 1.14 million tonnes/day for domestic and industrial use, projected to grow to 1.83 million tonnes/day by 2021. Even that supply can be met only through joint dispatch of many water sources and works (see the original figure) (… with CTSP Phase 3, 2.02 million tonnes/day), and the sources are also vulnerable to turbidity. To meet the needs of planning (satisfying demand), allocation (joint operation at full potential and flexible dispatch), and backup (against turbidity, disasters, and drought), a new Houli water treatment plant (drawing on the 大安溪 and 大甲溪), the 八寶堰 weir, and improvements to the 豐原 treatment plant should be pushed forward at once. Even so, the Phase 3 Houli base will still need water diverted from agriculture.[3]
3. Wastewater discharged into irrigation channels
After treatment, industrial wastewater from the Houli base (the Houli Farm portion) will be discharged at the mouth of the 大安溪, and in the near term into the 牛稠坑溝 channel; whether residual pollutants will degrade farmland, and thus food safety, is an open question. Over the long term the pollutants will accumulate at the 大安溪 estuary, degrading water quality throughout the 大甲溪 and 大安溪 basins and harming downstream agriculture and fisheries. The 客雅溪, which receives effluent from the Hsinchu Science Park, is a warning: its water is badly polluted, and oysters farmed at its mouth carry heavy metals and carcinogens at levels unfit for consumption.
Residents do draw irrigation water from the 牛稠坑溝, yet the developer obscures the project's impact on those fields by pointing out that the area lies outside any Irrigation Association district. Water is likewise still drawn from the lower 大甲溪 for irrigation, and since this project will further reduce the river's flow, the discharge should be held to a stricter standard: treated to meet irrigation water quality standards.
In the initial period the discharge will reach the sea via the 大甲溪, passing the irrigation intakes of farmland in the 大甲, 大安, and 清水 districts; wastewater entering the fields would affect more than 3,000 hectares, a serious threat to farm product safety. The water that infiltrates to groundwater is also the local drinking supply, and because the soil layer there is thin and filters almost nothing, residents' health is at stake.
4. Toxic gas emissions
The semiconductor and optoelectronics processes coming to the Houli base require strong acid and alkali solutions; hydrofluoric acid alone will be emitted at 74 tonnes per year, about 203 kg per day. Other strong acids and alkalis plus volatile organic compounds have a permitted annual fugitive emission total as high as 3,318 tonnes, and the daily exhaust will blow from Houli's windward side toward the town's population centers of 上后里 and 下后里. Add the future Qixing Farm development, the incinerator, and the 豐興 steel mill, and the long-term health risk to Houli residents is considerable.
It was recently reported that emissions from firms at the CTSP 大雅 (Daya) base have pushed airborne carcinogen concentrations near Tunghai University to elevated levels.
5. Raw materials partly undisclosed; toxic emissions untrackable; no sound health risk assessment possible
The developer never managed to make the prospective tenant firms fully disclose the raw materials used in their processes. EIA commissioners asked repeatedly; in the end five percent of the materials were written off as "trade secrets" that even the firms could not get their foreign suppliers to reveal.
The EPA's environmental analysis laboratory once tested wastewater from optoelectronics plants in the Hsinchu area: although the effluent met discharge standards, its biotoxicity exceeded 100 toxic units, meaning that even diluted 100-fold it still killed fish outright.[1] When environmental agencies cannot track toxic emissions, they can hardly craft effective management strategies; for the same reason the health risk assessment team could not carry out its work properly.
6. The Science Park Operations Fund is deep in the red, yet keeps subsidizing a high-risk industry (the industrial and financial risks of the panel business)
To develop the science parks and nurture their tenants, the National Science Council's Science Park Operations Fund has so far accumulated more than NT$900 billion in debt. Firms obtain large concessions while braving volatile market demand, and the government has pressed the banks to raise their syndicated-loan limits; the financial risk from these latent adverse factors is borne by the public as a whole.
7. Ozone impacts modeled against stale background data
The environmental impact statement uses the EPA's 2000 local ozone concentrations as its background values, ignoring the commissioners' repeated demands for the newer 2003, or better 2006, figures; the real reason is that with more recent data, ozone concentrations would badly exceed the standard.
8. Tourism (the 后豐鐵馬道 bike trail)
The 后豐鐵馬道 is a nationally famous scenic cycling route; a web search already returns dozens of pages, and residents post photos of its present landscape online. The developer has neither analyzed in earnest whether the project will affect the trail nor offered concrete, feasible alternatives.
Nor has anyone assessed how harm to the trail would affect the residents who make their living from its tourism. Meanwhile the access road crosses the trail in the plans; could the trail's existing green belt at least be counted toward the park's green belt? The developer should leave space for the public rather than reserve it for itself.
9. The 142nd commission meeting that passed the EIA was gravely flawed and violated procedural justice
Citing "administrative efficiency", the chair neither let the Houli residents' representatives, who had traveled far, speak at any length, nor allowed the commissioners who had raised the problems to state their reasons. The case was finally forced through by a vote of the mostly government-appointed commissioners, with the underlying questions entirely unresolved.
(3) Other questions beyond the EIA
(1) Benefits funneled to a single firm
When Qixing Farm was planned, the government was to invest at least NT$8.6 billion for a single TFT-LCD maker, AU Optronics (友達光電); several EIA commissioners and legislators questioned the developer for favoring one firm, contrary to fairness and justice. The press recently reported that, with market prospects dimming, AUO has decided to scale back, cutting its investment from NT$400 billion to NT$254 billion; the Qixing park layout will be re-planned.[2]
(2) Implementing EIA conclusions (the Houli Farm example)
Development of Houli Farm has begun, but the developer has not carried out the environmental monitoring required by the impact statement. The result has been serious violations: construction beyond permitted hours, heavy dust, and materials piled on main roads infringing the right-of-way of ordinary traffic.
(3) Outfall placement (open channel or buried pipe)
The developer first failed in its duty to manage the park's tenant firms, and then failed to communicate adequately with residents, triggering large-scale local protests.
(4) Coastal ecology (the Chinese white dolphin; the 高美濕地 wetland)
A population of Chinese white dolphins lives close inshore along Taiwan's central west coast; researchers estimate that only 50 to 200 individuals have been observed. Industrial pollution discharged to the west coast threatens this population's survival.
Wastewater discharged via the 大甲溪 will also affect the ecology of the 高美濕地 (Gaomei Wetland; the grey-blue area on the original map).
2007/03/10 Class material (3) --- Dancing with Systems
What to do when systems resist change; an excerpt from Donella Meadows's unfinished last book.
By Donella Meadows
(Whole Earth Winter 2001)
People who are raised in the industrial world and who get enthused about systems thinking are likely to make a terrible mistake. They are likely to assume that here, in systems analysis, in interconnection and complication, in the power of the computer, here at last, is the key to prediction and control. This mistake is likely because the mindset of the industrial world assumes that there is a key to prediction and control.
I assumed that at first too. We all assumed it, as eager systems students at the great institution called MIT. More or less innocently, enchanted by what we could see through our new lens, we did what many discoverers do. We exaggerated our own ability to change the world. We did so not with any intent to deceive others, but in the expression of our own expectations and hopes. Systems thinking for us was more than subtle, complicated mindplay. It was going to Make Systems Work.
But self-organizing, nonlinear, feedback systems are inherently unpredictable. They are not controllable. They are understandable only in the most general way. The goal of foreseeing the future exactly and preparing for it perfectly is unrealizable. The idea of making a complex system do just what you want it to do can be achieved only temporarily, at best. We can never fully understand our world, not in the way our reductionistic science has led us to expect. Our science itself, from quantum theory to the mathematics of chaos, leads us into irreducible uncertainty. For any objective other than the most trivial, we can't optimize; we don't even know what to optimize. We can't keep track of everything. We can't find a proper, sustainable relationship to nature, each other, or the institutions we create, if we try to do it from the role of omniscient conqueror.
For those who stake their identity on the role of omniscient conqueror, the uncertainty exposed by systems thinking is hard to take. If you can't understand, predict, and control, what is there to do?
Systems thinking leads to another conclusion, however—waiting, shining, obvious as soon as we stop being blinded by the illusion of control. It says that there is plenty to do, of a different sort of "doing." The future can't be predicted, but it can be envisioned and brought lovingly into being. Systems can't be controlled, but they can be designed and redesigned. We can't surge forward with certainty into a world of no surprises, but we can expect surprises and learn from them and even profit from them. We can't impose our will upon a system. We can listen to what the system tells us, and discover how its properties and our values can work together to bring forth something much better than could ever be produced by our will alone.
We can't control systems or figure them out. But we can dance with them! I already knew that, in a way before I began to study systems. I had learned about dancing with great powers from whitewater kayaking, from gardening, from playing music, from skiing. All those endeavors require one to stay wide awake, pay close attention, participate flat out, and respond to feedback. It had never occurred to me that those same requirements might apply to intellectual work, to management, to government, to getting along with people.
But there it was, the message emerging from every computer model we made. Living successfully in a world of systems requires more of us than our ability to calculate. It requires our full humanity—our rationality, our ability to sort out truth from falsehood, our intuition, our compassion, our vision, and our morality.
I will summarize the most general "systems wisdoms" I have absorbed from modeling complex systems and hanging out with modelers. These are the take-home lessons, the concepts and practices that penetrate the discipline of systems so deeply that one begins, however imperfectly, to practice them not just in one's profession, but in all of life.
The list probably isn't complete, because I am still a student in the school of systems. And it isn't unique to systems thinking. There are many ways to learn to dance. But here, as a start-off dancing lesson, are the practices I see my colleagues adopting, consciously or unconsciously, as they encounter systems.
Get the beat.
Before you disturb the system in any way, watch how it behaves. If it's a piece of music or a whitewater rapid or a fluctuation in a commodity price, study its beat. If it's a social system, watch it work. Learn its history. Ask people who've been around a long time to tell you what has happened. If possible, find or make a time graph of actual data from the system. People's memories are not always reliable when it comes to timing.
Starting with the behavior of the system forces you to focus on facts, not theories. It keeps you from falling too quickly into your own beliefs or misconceptions, or those of others. It's amazing how many misconceptions there can be. People will swear that rainfall is decreasing, say, but when you look at the data, you find that what is really happening is that variability is increasing—the droughts are deeper, but the floods are greater too. I have been told with great authority that milk price was going up when it was going down, that real interest rates were falling when they were rising, that the deficit was a higher fraction of the GNP than ever before when it wasn't.
Starting with the behavior of the system directs one's thoughts to dynamic, not static analysis—not only to "what's wrong?" but also to "how did we get there?" and "what behavior modes are possible?" and "if we don't change direction, where are we going to end up?"
And finally, starting with history discourages the common and distracting tendency we all have to define a problem not by the system's actual behavior, but by the lack of our favorite solution. (The problem is, we need to find more oil. The problem is, we need to ban abortion. The problem is, how can we attract more growth to this town?)
Listen to the wisdom of the system.
Aid and encourage the forces and structures that help the system run itself. Don't be an unthinking intervener and destroy the system's own self-maintenance capacities. Before you charge in to make things better, pay attention to the value of what's already there.
A friend of mine, Nathan Gray, was once an aid worker in Guatemala. He told me of his frustration with agencies that would arrive with the intention of "creating jobs" and "increasing entrepreneurial abilities" and "attracting outside investors." They would walk right past the thriving local market, where small-scale business people of all kinds, from basket-makers to vegetable growers to butchers to candy sellers, were displaying their entrepreneurial abilities in jobs they had created for themselves. Nathan spent his time talking to the people in the market, asking about their lives and businesses, learning what was in the way of those businesses expanding and incomes rising. He concluded that what was needed was not outside investors, but inside ones. Small loans available at reasonable interest rates, and classes in literacy and accounting, would produce much more long-term good for the community than bringing in a factory or assembly plant from outside.
Expose your mental models to the open air.
Remember, always, that everything you know, and everything everyone knows, is only a model. Get your model out there where it can be shot at. Invite others to challenge your assumptions and add their own. Instead of becoming a champion for one possible explanation or hypothesis or model, collect as many as possible. Consider all of them plausible until you find some evidence that causes you to rule one out. That way you will be emotionally able to see the evidence that rules out an assumption with which you might have confused your own identity.
You don't have to put forth your mental model with diagrams and equations, though that's a good discipline. You can do it with words or lists or pictures or arrows showing what you think is connected to what. The more you do that, in any form, the clearer your thinking will become, the faster you will admit your uncertainties and correct your mistakes, and the more flexible you will learn to be. Mental flexibility—the willingness to redraw boundaries, to notice that a system has shifted into a new mode, to see how to redesign structure—is a necessity when you live in a world of flexible systems.
Stay humble. Stay a learner.
Systems thinking has taught me to trust my intuition more and my figuring-out rationality less, to lean on both as much as I can, but still to be prepared for surprises. Working with systems, on the computer, in nature, among people, in organizations, constantly reminds me of how incomplete my mental models are, how complex the world is, and how much I don't know.
The thing to do, when you don't know, is not to bluff and not to freeze, but to learn. The way you learn is by experiment—or, as Buckminster Fuller put it, by trial and error, error, error. In a world of complex systems it is not appropriate to charge forward with rigid, undeviating directives. "Stay the course" is only a good idea if you're sure you're on course. Pretending you're in control even when you aren't is a recipe not only for mistakes, but for not learning from mistakes. What's appropriate when you're learning is small steps, constant monitoring, and a willingness to change course as you find out more about where it's leading.
That's hard. It means making mistakes and, worse, admitting them. It means what psychologist Don Michael calls "error-embracing." It takes a lot of courage to embrace your errors.
Honor and protect information.
A decision-maker can't respond to information he or she doesn't have, can't respond accurately to information that is inaccurate, can't respond in a timely way to information that is late. I would guess that 99 percent of what goes wrong in systems goes wrong because of faulty or missing information.
If I could, I would add an Eleventh Commandment: Thou shalt not distort, delay, or sequester information. You can drive a system crazy by muddying its information streams. You can make a system work better with surprising ease if you can give it more timely, accurate, and complete information.
For example, in 1986 new federal legislation required US companies to report all chemical emissions from each of their plants. Through the Freedom of Information Act (from a systems point of view one of the most important laws in the nation) that information became a matter of public record. In July 1988 the first data on chemical emissions became available. The reported emissions were not illegal, but they didn't look very good when they were published in local papers by enterprising reporters, who had a tendency to make lists of "the top ten local polluters." That's all that happened. There were no lawsuits, no required reductions, no fines, no penalties. But within two years chemical emissions nationwide (at least as reported, and presumably also in fact) had decreased by 40 percent. Some companies were launching policies to bring their emissions down by 90 percent, just because of the release of previously sequestered information.
Locate responsibility in the system.
Look for the ways the system creates its own behavior. Do pay attention to the triggering events, the outside influences that bring forth one kind of behavior from the system rather than another. Sometimes those outside events can be controlled (as in reducing the pathogens in drinking water to keep down incidences of infectious disease). But sometimes they can't. And sometimes blaming or trying to control the outside influence blinds one to the easier task of increasing responsibility within the system.
"Intrinsic responsibility" means that the system is designed to send feedback about the consequences of decision-making directly and quickly and compellingly to the decision-makers.
Dartmouth College reduced intrinsic responsibility when it took thermostats out of individual offices and classrooms and put temperature-control decisions under the guidance of a central computer. That was done as an energy-saving measure. My observation from a low level in the hierarchy is that the main consequence was greater oscillations in room temperature. When my office gets overheated now, instead of turning down the thermostat, I have to call an office across campus, which gets around to making corrections over a period of hours or days, and which often overcorrects, setting up the need for another phone call. One way of making that system more, rather than less, responsible, might have been to let professors keep control of their own thermostats and charge them directly for the amount of energy they use. (Thereby privatizing a commons!)
Designing a system for intrinsic responsibility could mean, for example, requiring all towns or companies that emit wastewater into a stream to place their intake pipe downstream from their outflow pipe. It could mean that neither insurance companies nor public funds should pay for medical costs resulting from smoking or from accidents in which a motorcycle rider didn't wear a helmet or a car rider didn't fasten the seat belt. It could mean Congress would no longer be allowed to legislate rules from which it exempts itself.
Make feedback policies for feedback systems.
President Jimmy Carter had an unusual ability to think in feedback terms and to make feedback policies. Unfortunately he had a hard time explaining them to a press and public that didn't understand feedback.
He suggested, at a time when oil imports were soaring, that there be a tax on gasoline proportional to the fraction of US oil consumption that had to be imported. If imports continued to rise the tax would rise, until it suppressed demand and brought forth substitutes and reduced imports. If imports fell to zero, the tax would fall to zero.
The tax never got passed.
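Though the tax never passed, the loop Carter described is easy to simulate. Below is a minimal sketch in Python; every number in it (domestic supply, base demand, the tax rate, the demand response) is invented for illustration, not historical data:

    # Minimal sketch of the proposed loop: a gasoline tax tied to the imported
    # fraction of consumption, suppressing demand until imports fall. Every
    # number (supply, demand, tax rate, responsiveness) is a demo assumption.
    domestic_supply = 10.0   # assumed fixed domestic production
    base_demand = 16.0       # demand with no tax at all
    demand = base_demand
    for year in range(12):
        imports = max(demand - domestic_supply, 0.0)
        import_fraction = imports / demand
        tax = 3.0 * import_fraction          # the tax rises with the import share
        demand = base_demand - 2.0 * tax     # a higher tax suppresses demand
        print(f"year {year:2d}: imports {imports:5.2f}, tax {tax:4.2f}, demand {demand:5.2f}")

Run it and the system settles itself: imports fall, the tax eases, and demand finds the level the feedback allows, with no fixed target legislated anywhere.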
Carter was also trying to deal with a flood of illegal immigrants from Mexico. He suggested that nothing could be done about that immigration as long as there was a great gap in opportunity and living standards between the US and Mexico. Rather than spending money on border guards and barriers, he said, we should spend money helping to build the Mexican economy, and we should continue to do so until the immigration stopped.
That never happened either.
You can imagine why a dynamic, self-adjusting system cannot be governed by a static, unbending policy. It's easier, more effective, and usually much cheaper to design policies that change depending on the state of the system. Especially where there are great uncertainties, the best policies not only contain feedback loops, but meta-feedback loops—loops that alter, correct, and expand loops. These are policies that design learning into the management process.
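As a toy illustration of a meta-feedback loop, here is a sketch (the dynamics and constants are invented) of an ordinary corrective loop whose own gain is tuned by a second loop watching how well the first is doing:

    # A corrective loop plus a meta-loop: the meta-loop strengthens the
    # correction while the error is large and relaxes it when close.
    goal, state, gain = 50.0, 10.0, 0.1
    for step in range(25):
        error = goal - state
        state += gain * error                 # ordinary feedback: correct the error
        if abs(error) > 5.0:                  # meta-feedback: tune the loop itself
            gain = min(gain * 1.2, 1.0)
        else:
            gain = max(gain * 0.95, 0.05)
        print(f"step {step:2d}: state {state:6.2f}, gain {gain:5.3f}")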
Pay attention to what is important, not just what is quantifiable.
Our culture, obsessed with numbers, has given us the idea that what we can measure is more important than what we can't measure. You can look around and make up your own mind about whether quantity or quality is the outstanding characteristic of the world in which you live.
If something is ugly, say so. If it is tacky, inappropriate, out of proportion, unsustainable, morally degrading, ecologically impoverishing, or humanly demeaning, don't let it pass. Don't be stopped by the "if you can't define it and measure it, I don't have to pay attention to it" ploy. No one can [precisely] define or measure justice, democracy, security, freedom, truth, or love. No one can [precisely] define or measure any value. But if no one speaks up for them, if systems aren't designed to produce them, if we don't speak about them and point toward their presence or absence, they will cease to exist.
Go for the good of the whole.
Don't maximize parts of systems or subsystems while ignoring the whole. As Kenneth Boulding once said, don't go to great trouble to optimize something that never should be done at all. Aim to enhance total systems properties, such as [creativity], stability, diversity, resilience, and sustainability—whether they are easily measured or not.
As you think about a system, spend part of your time from a vantage point that lets you see the whole system, not just the problem that may have drawn you to focus on the system to begin with. And realize that, especially in the short term, changes for the good of the whole may sometimes seem to be counter to the interests of a part of the system. It helps to remember that the parts of a system cannot survive without the whole. The long-term interests of your liver require the long-term health of your body, and the long-term interests of sawmills require the long-term health of forests.
Expand time horizons.
The official time horizon of industrial society doesn't extend beyond what will happen after the next election or beyond the payback period of current investments. The time horizon of most families still extends farther than that—through the lifetimes of children or grandchildren. Many Native American cultures actively spoke of and considered in their decisions the effects upon the seventh generation to come. The longer the operant time horizon, the better the chances for survival.
In the strict systems sense there is no long-term/short-term distinction. Phenomena at different timescales are nested within each other. Actions taken now have some immediate effects and some that radiate out for decades to come. We experience now the consequences of actions set in motion yesterday and decades ago and centuries ago.
When you're walking along a tricky, curving, unknown, surprising, obstacle-strewn path, you'd be a fool to keep your head down and look just at the next step in front of you. You'd be equally a fool just to peer far ahead and never notice what's immediately under your feet. You need to be watching both the short and long terms—the whole system.
Expand thought horizons.
Defy the disciplines. In spite of what you majored in, or what the textbooks say, or what you think you're an expert at, follow a system wherever it leads. It will be sure to lead across traditional disciplinary lines. To understand that system, you will have to be able to learn from—while not being limited by—economists and chemists and psychologists and theologians. You will have to penetrate their jargons, integrate what they tell you, recognize what they can honestly see through their particular lenses, and discard the distortions that come from the narrowness and incompleteness of their lenses. They won't make it easy for you.
Seeing systems whole requires more than being "interdisciplinary," if that word means, as it usually does, putting together people from different disciplines and letting them talk past each other. Interdisciplinary communication works only if there is a real problem to be solved, and if the representatives from the various disciplines are more committed to solving the problem than to being academically correct. They will have to go into learning mode, to admit ignorance and be willing to be taught, by each other and by the system.
It can be done. It's very exciting when it happens.
Expand the boundary of caring.
Living successfully in a world of complex systems means expanding not only time horizons and thought horizons; above all it means expanding the horizons of caring. There are moral reasons for doing that, of course. And if moral arguments are not sufficient, systems thinking provides the practical reasons to back up the moral ones. The real system is interconnected. No part of the human race is separate either from other human beings or from the global ecosystem. It will not be possible in this integrated world for your heart to succeed if your lungs fail, or for your company to succeed if your workers fail, or for the rich in Los Angeles to succeed if the poor in Los Angeles fail, or for Europe to succeed if Africa fails, or for the global economy to succeed if the global environment fails.
As with everything else about systems, most people already know the interconnections that make moral and practical rules turn out to be the same rules. They just have to bring themselves to believe what they know.
Celebrate complexity.
Let's face it, the universe is messy. It is nonlinear, turbulent, and chaotic. It is dynamic. It spends its time in transient behavior on its way to somewhere else, not in mathematically neat equilibria. It self-organizes and evolves. It creates diversity, not uniformity. That's what makes the world interesting, that's what makes it beautiful, and that's what makes it work.
There's something within the human mind that is attracted to straight lines and not curves, to whole numbers and not fractions, to uniformity and not diversity, and to certainties and not mystery. But there is something else within us that has the opposite set of tendencies, since we ourselves evolved out of and are shaped by and structured as complex feedback systems. Only a part of us, a part that has emerged recently, designs buildings as boxes with uncompromising straight lines and flat surfaces. Another part of us recognizes instinctively that nature designs in fractals, with intriguing detail on every scale from the microscopic to the macroscopic. That part of us makes Gothic cathedrals and Persian carpets, symphonies and novels, Mardi Gras costumes and artificial intelligence programs, all with embellishments almost as complex as the ones we find in the world around us.
Hold fast to the goal of goodness.
Examples of bad human behavior are held up, magnified by the media, affirmed by the culture, as typical. Just what you would expect. After all, we're only human. The far more numerous examples of human goodness are barely noticed. They are Not News. They are exceptions. Must have been a saint. Can't expect everyone to behave like that.
And so expectations are lowered. The gap between desired behavior and actual behavior narrows. Fewer actions are taken to affirm and instill ideals. The public discourse is full of cynicism. Public leaders are visibly, unrepentantly, amoral or immoral and are not held to account. Idealism is ridiculed. Statements of moral belief are suspect. It is much easier to talk about hate in public than to talk about love.
We know what to do about eroding goals. Don't weigh the bad news more heavily than the good. And keep standards absolute.
This is quite a list. Systems thinking can only tell us to do these things. It can't do them for us. And so we are brought to the gap between understanding and implementation. Systems thinking by itself cannot bridge that gap. But it can lead us to the edge of what analysis can do and then point beyond—to what can and must be done by the human spirit.
This work is licensed under a Creative Commons License.
2007/03/10 Class material (2): What is the function of soil assessment in environmental impact assessment?
What is the function of soil assessment in environmental impact assessment?
郭鴻裕
Agricultural Research Institute, Council of Agriculture (農委會農業試驗所)
I. Soil review norms in current EIA practice
Environmental factors examined in EIA review:
(1) Physical and chemical factors (topography, geology, soil, hydrology, water quality, meteorology, air quality, noise, vibration, odor, waste, etc.).
(2) Ecological factors (aquatic and terrestrial fauna and flora, habitats, etc.).
(3) Landscape and recreation factors (recreational resources, etc.).
(4) Socioeconomic factors (population, industry, land use, impacts on public facilities, induced traffic, residents' opinions, etc.).
(5) Cultural factors (monuments, archaeological sites, historic buildings, etc.).
(6) Other environmental factors.
II. Soil factors in current EIA practice
Considered directly: soil heavy-metal concentrations.
Considered indirectly:
- site drainage (detention ponds) — soil and water conservation plans
- soil liquefaction and slope failure — geology, topography
- cut-and-fill spoil — waste disposal
- archaeological sites — cultural assets
- terrestrial flora and fauna habitat — ecology
III. Interpretations of the functions of (agricultural and forest) soil (1)
Japan (environmental soil science):
plant production; purification (of water and pollutants); water storage and percolation; preservation of buried cultural assets; amenity (landscape, citizens' farms); nature education and teaching material; support for buildings; ground for land works; construction material; ceramic raw material.
IV. Interpretations of the functions of (agricultural and forest) soil (2)
Water retention; flood prevention; water purification; prevention of landslides; prevention of soil erosion; purification of pollutants; residential amenity; health and recuperation (physiological and psychological).
(Note: the underlined items are those actually discussed in current EIAs.)
V. Interpretations of the functions of (agricultural and forest) soil (3)
UK Department for Environment, Food and Rural Affairs (Defra); Blum (1993):
Environmental interaction:
(1) the interface of air, geology, water, and land use;
(2) a filter for water and a deposition site for airborne dust;
(3) release and uptake of atmospheric gases, and a store of carbon (greenhouse gases);
(4) regulation of flow and moisture during rainfall.
Production of food and fiber (food, timber, energy, livestock, and fiber).
Provision of a platform:
(1) foundations for civil works (roads and buildings);
(2) influence on land use and the shaping of landscape.
(Note: the underlined items are those actually discussed in current EIAs.)
Support for habitats and biodiversity:
(1) determining the distribution of nature and life;
(2) the foundation of terrestrial ecosystems: supplying water, nutrients, root space, a seed bank, and habitat for microbes and soil macrofauna.
Source of raw materials:
(1) direct supply of minerals and resources such as peat and topsoil;
(2) a natural reservoir storing large amounts of water.
Protection of cultural assets: safeguarding our cultural heritage and the record of environmental change.
(Note: the underlined items are those actually discussed in current EIAs.)
VII. The Qixing Farm case: gaps in the EIA's treatment of soil functions (1)
Considered:
- soil heavy metals
- detention pond (the flood-prevention function, but the terminal drainage can carry only a 10-year flood peak)
- cultural sites (none found during the assessment)
- soil liquefaction and faults (the platform function; the faults are a threat, yet development proceeds anyway)
- spoil (sent to a soil and gravel yard, with no consideration of reuse as brick or pottery clay, land reclamation, etc.)
VIII. The Qixing Farm case: gaps in the EIA's treatment of soil functions (2)
Not considered:
- environmental interaction:
  (1) the interface of air, geology, water, and land use;
  (2) a filter for water and a deposition site for airborne dust;
  (3) release and uptake of atmospheric gases, and a store of carbon (greenhouse gases);
  (4) regulation of flow and moisture during rainfall
- production of food and fiber (food, timber, energy, livestock, and fiber)
- provision of a platform:
  (1) foundations for civil works (roads and buildings);
  (2) influence on land use and landscape (the remnant-hill landscape; the 后豐鐵馬道 trail)
IX. The Qixing Farm case: gaps in the EIA's treatment of soil functions (3)
Not considered:
- support for habitats and biodiversity:
  (1) determining the distribution of nature and life;
  (2) the foundation of terrestrial ecosystems (water, nutrients, root space, seed bank, habitat for microbes and soil macrofauna)
- source of raw materials:
  (1) direct supply of minerals and resources such as peat and topsoil;
  (2) a natural reservoir storing large amounts of water
- protection of cultural assets (our heritage and the record of environmental change: what is the significance of the red laterite of the Houli terrace?)
X. Why can't soil functions be assessed?
- How large must the affected area be to matter? (more than 10 ha?)
- Incompleteness of soil data? (the developer should conduct supplementary surveys)
- Compensation measures for lost soil functions (valuing both priced and unpriced functions)
- The importance of soil functions is not something the public can feel directly.
- Taiwanese habits: solving every problem with engineering (a result of population density? of education?) and measuring everything in money?
Proposed remedy: draft review norms for assessing soil functions?
2007/03/10 Class material (1) - Places to Intervene in a System
Places to Intervene in a System
By Donella H. Meadows
(Whole Earth Winter 1997)
Folks who do systems analysis have a great belief in "leverage points." These are places within a complex system (a corporation, an economy, a living body, a city, an ecosystem) where a small shift in one thing can produce big changes in everything.
The systems community has a lot of lore about leverage points. Those of us who were trained by the great Jay Forrester at MIT have absorbed one of his favorite stories. "People know intuitively where leverage points are. Time after time I've done an analysis of a company, and I've figured out a leverage point. Then I've gone to the company and discovered that everyone is pushing it in the wrong direction!"
The classic example of that backward intuition was Forrester's first world model. Asked by the Club of Rome to show how major global problems—poverty and hunger, environmental destruction, resource depletion, urban deterioration, unemployment—are related and how they might be solved, Forrester came out with a clear leverage point: Growth. Both population and economic growth. Growth has costs—among which are poverty and hunger, environmental destruction—the whole list of problems we are trying to solve with growth!
The world's leaders are correctly fixated on economic growth as the answer to virtually all problems, but they're pushing with all their might in the wrong direction.
Counterintuitive. That's Forrester's word to describe complex systems. The systems analysts I know have come up with no quick or easy formulas for finding leverage points. Our counterintuitions aren't that well developed. Give us a few months or years and we'll model the system and figure it out. We know from bitter experience that when we do discover the system's leverage points, hardly anybody will believe us.
Very frustrating. So one day I was sitting in a meeting about the new global trade regime, NAFTA and GATT and the World Trade Organization. The more I listened, the more I began to simmer inside. "This is a HUGE NEW SYSTEM people are inventing!" I said to myself. "They haven't the slightest idea how it will behave," myself said back to me. "It's cranking the system in the wrong direction—growth, growth at any price!! And the control measures these nice folks are talking about—small parameter adjustments, weak negative feedback loops—are PUNY!"
Suddenly, without quite knowing what was happening, I got up, marched to the flip chart, tossed over a clean page, and wrote: "Places to Intervene in a System," followed by nine items:
9. Numbers (subsidies, taxes, standards).
8. Material stocks and flows.
7. Regulating negative feedback loops.
6. Driving positive feedback loops.
5. Information flows.
4. The rules of the system (incentives, punishment, constraints).
3. The power of self-organization.
2. The goals of the system.
1. The mindset or paradigm out of which the goals, rules, feedback structure arise.
Everyone in the meeting blinked in surprise, including me. "That's brilliant!" someone breathed. "Huh?" said someone else.
I realized that I had a lot of explaining to do.
In a minute I'll go through the list, translate the jargon, give examples and exceptions. First I want to place the list in a context of humility. What bubbled up in me that day was distilled from decades of rigorous analysis of many different kinds of systems done by many smart people. But complex systems are, well, complex. It's dangerous to generalize about them. What you are about to read is not a recipe for finding leverage points. Rather it's an invitation to think more broadly about system change.
That's why leverage points are not intuitive.
9. Numbers.
Numbers ("parameters" in systems jargon) determine how much of a discrepancy turns which faucet how fast. Maybe the faucet turns hard, so it takes a while to get the water flowing. Maybe the drain is blocked and can allow only a small flow, no matter how open it is. Maybe the faucet can deliver with the force of a fire hose. These considerations are a matter of numbers, some of which are physically locked in, but most of which are popular intervention points.
Consider the national debt. It's a negative bathtub, a money hole. The rate at which it sinks is the annual deficit. Tax income makes it rise, government expenditures make it fall. Congress and the president argue endlessly about the many parameters that open and close tax faucets and spending drains. Since those faucets and drains are connected to the voters, these are politically charged parameters. But, despite all the fireworks, and no matter which party is in charge, the money hole goes on sinking, just at different rates.
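The bathtub metaphor is easy to make concrete. A sketch with made-up figures, just to show the stock integrating its flows:

    # Stock-and-flow sketch of the money hole: the debt stock integrates the
    # deficit flow (spending minus tax income). All figures are illustrative.
    debt = 5000.0          # initial debt stock
    tax_income = 1800.0    # inflow that shrinks the hole
    spending = 1950.0      # outflow that deepens it
    for year in range(1, 11):
        deficit = spending - tax_income    # the net flow into the debt stock
        debt += deficit
        print(f"year {year:2d}: deficit {deficit:6.1f}, debt {debt:7.1f}")

Fiddle with the two flow parameters as much as you like; unless spending drops below tax income, the stock only sinks at a different rate, which is the point of the paragraph above.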
The amount of land we set aside for conservation. The minimum wage. How much we spend on AIDS research or Stealth bombers. The service charge the bank extracts from your account. All these are numbers, adjustments to faucets. So, by the way, is firing people and getting new ones. Putting different hands on the faucets may change the rate at which they turn, but if they're the same old faucets, plumbed into the same system, turned according to the same information and rules and goals, the system isn't going to change much. Bill Clinton is different from George Bush, but not all that different.
Numbers are last on my list of leverage points. Diddling with details, arranging the deck chairs on the Titanic. Probably ninety-five percent of our attention goes to numbers, but there's not a lot of power in them.
Not that parameters aren't important—they can be, especially in the short term and to the individual who's standing directly in the flow. But they RARELY CHANGE BEHAVIOR. If the system is chronically stagnant, parameter changes rarely kick-start it. If it's wildly variable, they don't usually stabilize it. If it's growing out of control, they don't brake it.
Whatever cap we put on campaign contributions, it doesn't clean up politics. The Feds fiddling with the interest rate haven't made business cycles go away. (We always forget that during upturns, and are shocked, shocked by the downturns.) Spending more on police doesn't make crime go away.
However, there are critical exceptions. Numbers become leverage points when they go into ranges that kick off one of the items higher on this list. Interest rates or birth rates control the gains around positive feedback loops. System goals are parameters that can make big differences. Sometimes a system gets onto a chaotic edge, where the tiniest change in a number can drive it from order to what appears to be wild disorder.
Probably the most common kind of critical number is the length of delay in a feedback loop. Remember that bathtub on the fourth floor I mentioned, with the water heater in the basement? I actually experienced one of those once, in an old hotel in London. It wasn't even a bathtub with buffering capacity; it was a shower. The water temperature took at least a minute to respond to my faucet twists. Guess what my shower was like. Right, oscillations from hot to cold and back to hot, punctuated with expletives. Delays in negative feedback loops cause oscillations. If you're trying to adjust a system state to your goal, but you only receive delayed information about what the system state is, you will overshoot and undershoot.
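The London shower can be reproduced in a few lines. In this sketch (all parameters invented), the correction responds to temperature information several steps old, and the overshoot-undershoot cycle appears by itself:

    # A negative feedback loop acting on delayed information oscillates.
    DELAY = 4      # steps between a faucet twist and the felt temperature
    GOAL = 38.0    # desired water temperature
    GAIN = 0.4     # how hard we twist per degree of perceived error
    temps = [15.0] * (DELAY + 1)
    for step in range(24):
        felt = temps[-DELAY - 1]            # what we feel is DELAY steps old
        twist = GAIN * (GOAL - felt)        # respond to the stale error
        temps.append(temps[-1] + twist)     # the actual temperature follows the faucet
        print(f"step {step:2d}: felt {felt:5.1f}, actual {temps[-1]:6.1f}")

With DELAY set to zero the same loop glides smoothly to the goal; the oscillation comes entirely from acting on old information.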
Same if your information is timely, but your response isn't. For example, it takes several years to build an electric power plant, and then that plant lasts, say, thirty years. Those delays make it impossible to build exactly the right number of plants to supply a rapidly changing demand. Even with immense effort at forecasting, almost every electricity industry in the world experiences long oscillations between overcapacity and undercapacity. A system just can't respond to short-term changes when it has long-term delays. That's why a massive central-planning system, such as the Soviet Union or General Motors, necessarily functions poorly.
A delay in a feedback process is critical RELATIVE TO RATES OF CHANGE (growth, fluctuation, decay) IN THE SYSTEM STATE THAT THE FEEDBACK LOOP IS TRYING TO CONTROL. Delays that are too short cause overreaction, oscillations amplified by the jumpiness of the response. Delays that are too long cause damped, sustained, or exploding oscillations, depending on how much too long. At the extreme they cause chaos. Delays in a system with a threshold, a danger point, a range past which irreversible damage can occur, cause overshoot and collapse.
Delay length would be a high leverage point, except for the fact that delays are not often easily changeable. Things take as long as they take. You can't do a lot about the construction time of a major piece of capital, or the maturation time of a child, or the growth rate of a forest. It's usually easier to slow down the change rate (positive feedback loops, higher on this list), so feedback delays won't cause so much trouble. Critical numbers are not nearly as common as people seem to think they are. Most systems have evolved or are designed to stay out of sensitive parameter ranges. Mostly, the numbers are not worth the sweat put into them.
8. Material stocks and flows.
The plumbing structure, the stocks and flows and their physical arrangement, can have an enormous effect on how a system operates.
When the Hungarian road system was laid out so all traffic from one side of the nation to the other had to pass through central Budapest, that determined a lot about air pollution and commuting delays that are not easily fixed by pollution control devices, traffic lights, or speed limits. The only way to fix a system that is laid out wrong is to rebuild it, if you can.
Often you can't, because physical building is a slow and expensive kind of change. Some stock-and-flow structures are just plain unchangeable.
The baby-boom swell in the US population first caused pressure on the elementary school system, then high schools and colleges, then jobs and housing, and now we're looking forward to supporting its retirement. Not much to do about it, because five-year-olds become six-year-olds, and sixty-four-year-olds become sixty-five-year-olds predictably and unstoppably. The same can be said for the lifetime of destructive CFC molecules in the ozone layer, for the rate at which contaminants get washed out of aquifers, for the fact that an inefficient car fleet takes ten to twenty years to turn over.
The possible exceptional leverage point here is in the size of stocks, or buffers. Consider a huge bathtub with slow in and outflows. Now think about a small one with fast flows. That's the difference between a lake and a river. You hear about catastrophic river floods much more often than catastrophic lake floods, because stocks that are big, relative to their flows, are more stable than small ones. A big, stabilizing stock is a buffer.
The stabilizing power of buffers is why you keep money in the bank rather than living from the flow of change through your pocket. It's why stores hold inventory instead of calling for new stock just as customers carry the old stock out the door. It's why we need to maintain more than the minimum breeding population of an endangered species. Soils in the eastern US are more sensitive to acid rain than soils in the west, because they haven't got big buffers of calcium to neutralize acid. You can often stabilize a system by increasing the capacity of a buffer. But if a buffer is too big, the system gets inflexible. It reacts too slowly. Businesses invented just-in-time inventories, because occasional vulnerability to fluctuations or screw-ups is cheaper than certain, constant inventory costs—and because small-to-vanishing inventories allow more flexible response to shifting demand.
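The lake-versus-river contrast can be simulated directly. In this sketch (numbers illustrative only), two stocks receive statistically identical noisy inflows; the one that is large relative to its flows barely moves, while the small fast one swings widely:

    # Same noisy inflow into a big slow stock (lake) and a small fast one (river).
    import random
    random.seed(1)
    def run(stock, outflow_rate, steps=200):
        levels = []
        for _ in range(steps):
            inflow = 10.0 + random.uniform(-5.0, 5.0)   # identical noise statistics
            stock += inflow - outflow_rate * stock       # outflow tracks the stock
            levels.append(stock)
        mean = sum(levels) / len(levels)
        return mean, (max(levels) - min(levels)) / mean  # relative swing
    for name, stock, rate in [("lake ", 1000.0, 0.01), ("river", 20.0, 0.5)]:
        mean, swing = run(stock, rate)
        print(f"{name}: mean level {mean:7.1f}, relative swing {swing:.0%}")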
There's leverage, sometimes magical, in changing the size of buffers. But buffers are usually physical entities, not easy to change.
The acid absorption capacity of eastern soils is not a leverage point for alleviating acid rain damage. The storage capacity of a dam is literally cast in concrete. Physical structure is crucial in a system, but the leverage point is in proper design in the first place. After the structure is built, the leverage is in understanding its limitations and bottlenecks and refraining from fluctuations or expansions that strain its capacity.
7. Regulating negative feedback loops.
Now we're beginning to move from the physical part of the system to the information and control parts, where more leverage can be found. Nature evolves negative feedback loops and humans invent them to keep system states within safe bounds.
A thermostat loop is the classic example. Its purpose is to keep the system state called "room temperature" fairly constant at a desired level. Any negative feedback loop needs a goal (the thermostat setting), a monitoring and signaling device to detect excursions from the goal (the thermostat), and a response mechanism (the furnace and/or air conditioner, fans, heat pipes, fuel, etc.).
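Those three parts (goal, monitor, response) map directly onto a toy thermostat; the constants below are invented for the demo:

    # Goal: the setting. Monitor: the measured error. Response: the furnace.
    goal = 20.0        # thermostat setting
    room = 12.0
    for minute in range(12):
        error = goal - room                   # monitor: excursion from the goal
        furnace = max(error, 0.0) * 0.6       # response: heat output
        room += furnace - 0.2 * (room - 5.0)  # heating minus loss to 5 deg outside
        print(f"minute {minute:2d}: room {room:5.2f}")

With this purely proportional response the room settles a little below the goal; real thermostats switch the furnace fully on and off instead, which is a different response mechanism for the same loop.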
A complex system usually has numerous negative feedback loops it can bring into play, so it can self-correct under different conditions and impacts. Some of those loops may be inactive much of the time—like the emergency cooling system in a nuclear power plant, or your ability to sweat or shiver to maintain your body temperature. One of the big mistakes we make is to strip away these emergency response mechanisms because they aren't often used and they appear to be costly. In the short term we see no effect from doing this. In the long term, we narrow the range of conditions over which the system can survive.
One of the most heartbreaking ways we do this is in encroaching on the habitats of endangered species. Another is in encroaching on our own time for rest, recreation, socialization, and meditation.
The "strength" of a negative loop—its ability to keep its appointed stock at or near its goal—depends on the combination of all its parameters and links—the accuracy and rapidity of monitoring, the quickness and power of response, the directness and size of corrective flows.
There can be leverage points here. Take markets, for example, the negative feedback systems that are all but worshiped by economists—and they can indeed be marvels of self-correction, as prices vary to keep supply and demand in balance. The more the price—the central signal to both producers and consumers—is kept clear, unambiguous, timely, and truthful, the more smoothly markets will operate. Prices that reflect full costs will tell consumers how much they can actually afford and will reward efficient producers. Companies and governments are fatally attracted to the price leverage point, of course, all of them pushing in the wrong direction with subsidies, fixes, externalities, taxes, and other forms of confusion. The REAL leverage here is to keep them from doing it. Hence anti-trust laws, truth-in-advertising laws, attempts to internalize costs (such as pollution taxes), the removal of perverse subsidies, and other ways of leveling market playing fields.
The strength of a negative feedback loop is important RELATIVE TO THE IMPACT IT IS DESIGNED TO CORRECT. If the impact increases in strength, the feedbacks have to be strengthened too.
A thermostat system may work fine on a cold winter day—but open all the windows and its corrective power will fail. Democracy worked better before the advent of the brainwashing power of centralized mass communications. Traditional controls on fishing were sufficient until radar spotting and drift nets and other technologies made it possible for a few actors to wipe out the fish. The power of big industry calls for the power of big government to hold it in check; a global economy makes necessary a global government.
Here are some other examples of strengthening negative feedback controls to improve a system's self-correcting abilities: preventive medicine, exercise, and good nutrition to bolster the body's ability to fight disease, integrated pest management to encourage natural predators of crop pests, the Freedom of Information Act to reduce government secrecy, protection for whistle blowers, impact fees, pollution taxes, and performance bonds to recapture the externalized public costs of private benefits.
6. Driving positive feedback loops.
A positive feedback loop is self-reinforcing. The more it works, the more it gains power to work some more.
The more people catch the flu, the more they infect other people. The more babies are born, the more people grow up to have babies. The more money you have in the bank, the more interest you earn, the more money you have in the bank. The more the soil erodes, the less vegetation it can support, the fewer roots and leaves to soften rain and runoff, the more soil erodes. The more high-energy neutrons in the critical mass, the more they knock into nuclei and generate more.
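The bank-account loop from that list fits in four lines; the 7 percent rate is an arbitrary choice:

    # Positive loop in miniature: the gain is proportional to the stock itself.
    balance = 100.0
    for year in range(1, 11):
        balance += 0.07 * balance
        print(f"year {year:2d}: balance {balance:7.2f}")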
Positive feedback loops drive growth, explosion, erosion, and collapse in systems. A system with an unchecked positive loop ultimately will destroy itself. That's why there are so few of them.
Usually a negative loop kicks in sooner or later. The epidemic runs out of infectable people—or people take increasingly strong steps to avoid being infected. The death rate rises to equal the birth rate—or people see the consequences of unchecked population growth and have fewer babies. The soil erodes away to bedrock, and after a million years the bedrock crumbles into new soil—or people put up check dams and plant trees.
In those examples, the first outcome is what happens if the positive loop runs its course, the second is what happens if there's an intervention to reduce its power.
Reducing the gain around a positive loop—slowing the growth—is usually a more powerful leverage point in systems than strengthening negative loops, and much preferable to letting the positive loop run.
Population and economic growth rates in the world model are leverage points, because slowing them gives the many negative loops, through technology and markets and other forms of adaptation, time to function. It's the same as slowing the car when you're driving too fast, rather than calling for more responsive brakes or technical advances in steering.
The most interesting behavior that rapidly turning positive loops can trigger is chaos. This wild, unpredictable, unreplicable, and yet bounded behavior happens when a system starts changing much, much faster than its negative loops can react to it.
For example, if you keep raising the capital growth rate in the world model, eventually you get to a point where one tiny increase more will shift the economy from exponential growth to oscillation. Another nudge upward gives the oscillation a double beat. And just the tiniest further nudge sends it into chaos.
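The sequence described here (steady state, then oscillation, then a double beat, then chaos) is the classic period-doubling route, and it can be shown with the logistic map as a stand-in; this is not Meadows's world model, just the textbook demonstration of the same behavior:

    # Raise r and the settled behavior goes fixed point -> 2-cycle -> 4-cycle -> chaos.
    def settled(r, warmup=1000, keep=8):
        x = 0.5
        for _ in range(warmup):
            x = r * x * (1 - x)
        values = []
        for _ in range(keep):
            x = r * x * (1 - x)
            values.append(round(x, 4))
        return values
    for r in (2.8, 3.2, 3.5, 3.9):
        print(f"r = {r}: {settled(r)}")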
I don't expect the world economy to turn chaotic any time soon (not for that reason, anyway). That behavior occurs only in unrealistic parameter ranges, equivalent to doubling the size of the economy within a year. Real-world systems do turn chaotic, however, if something in them can grow or decline very fast. Fast-replicating bacteria or insect populations, very infectious epidemics, wild speculative bubbles in money systems, neutron fluxes in the guts of nuclear power plants. These systems are hard to control, and control must involve slowing down the positive feedbacks.
In more ordinary systems, look for leverage points around birth rates, interest rates, erosion rates, "success to the successful" loops, any place where the more you have of something, the more you have the possibility of having more.
5. Information flows.
There was this subdivision of identical houses, the story goes, except that the electric meter in some of the houses was installed in the basement and in others it was installed in the front hall, where the residents could see it constantly, going round faster or slower as they used more or less electricity. Electricity consumption was 30 percent lower in the houses where the meter was in the front hall.
Systems-heads love that story because it's an example of a high leverage point in the information structure of the system. It's not a parameter adjustment, not a strengthening or weakening of an existing loop. It's a NEW LOOP, delivering feedback to a place where it wasn't going before.
In 1986 the US government required that every factory releasing hazardous air pollutants report those emissions publicly. Suddenly everyone could find out precisely what was coming out of the smokestacks in town. There was no law against those emissions, no fines, no determination of "safe" levels, just information. But by 1990 emissions dropped 40 percent. One chemical company that found itself on the Top Ten Polluters list reduced its emissions by 90 percent, just to "get off that list."
Missing feedback is a common cause of system malfunction. Adding or rerouting information can be a powerful intervention, usually easier and cheaper than rebuilding physical structure.
The tragedy of the commons that is exhausting the world's commercial fisheries occurs because there is no feedback from the state of the fish population to the decision to invest in fishing vessels. (Contrary to economic opinion, the price of fish doesn't provide that feedback. As the fish get more scarce and hence more expensive, it becomes all the more profitable to go out and catch them. That's a perverse feedback, a positive loop that leads to collapse.)
It's important that the missing feedback be restored to the right place and in compelling form. It's not enough to inform all the users of an aquifer that the groundwater level is dropping. That could trigger a race to the bottom. It would be more effective to set a water price that rises steeply as the pumping rate exceeds the recharge rate.
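A sketch of such a pricing rule, with every number invented: the price climbs once pumping exceeds recharge, demand eases in response, and pumping settles near the sustainable rate:

    # Water price rises with the overdraft; demand has a modest elasticity.
    recharge = 100.0     # sustainable pumping rate
    pumping = 140.0      # initial overdraft
    for season in range(12):
        overdraft = max(pumping - recharge, 0.0)
        price_multiplier = 1.0 + 0.02 * overdraft   # steep rise past recharge
        pumping *= price_multiplier ** -0.3         # demand eases as price climbs
        print(f"season {season:2d}: price x{price_multiplier:4.2f}, pumping {pumping:6.1f}")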
Suppose taxpayers got to specify on their return forms what government services their tax payments must be spent on. (Radical democracy!) Suppose any town or company that puts a water intake pipe in a river had to put it immediately DOWNSTREAM from its own outflow pipe. Suppose any public or private official who made the decision to invest in a nuclear power plant got the waste from that plant stored on his/her lawn.
There is a systematic tendency on the part of human beings to avoid accountability for their own decisions. That's why there are so many missing feedback loops—and why this kind of leverage point is so often popular with the masses, unpopular with the powers that be, and effective, if you can get the powers that be to permit it to happen or go around them and make it happen anyway.
4. The rules of the system (incentives, punishments, constraints).
The rules of the system define its scope, boundaries, degrees of freedom. Thou shalt not kill. Everyone has the right of free speech. Contracts are to be honored. The president serves four-year terms and cannot serve more than two of them. Nine people on a team, you have to touch every base, three strikes and you're out. If you get caught robbing a bank, you go to jail.
Mikhail Gorbachev came to power in the USSR and opened information flows (glasnost) and changed the economic rules (perestroika), and look what happened.
Constitutions are strong social rules. Physical laws such as the second law of thermodynamics are absolute rules, if we understand them correctly. Laws, punishments, incentives, and informal social agreements are progressively weaker rules.
To demonstrate the power of rules, I ask my students to imagine different ones for a college. Suppose the students graded the teachers. Suppose you come to college when you want to learn something, and you leave when you've learned it. Suppose professors were hired according to their ability to solve real-world problems, rather than to publish academic papers. Suppose a class got graded as a group, instead of as individuals.
Rules change behavior. Power over rules is real power.
That's why lobbyists congregate when Congress writes laws, and why the Supreme Court, which interprets and delineates the Constitution—the rules for writing the rules—has even more power than Congress.
If you want to understand the deepest malfunctions of systems, pay attention to the rules, and to who has power over them.
That's why my systems intuition was sending off alarm bells as the new world trade system was explained to me. It is a system with rules designed by corporations, run by corporations, for the benefit of corporations. Its rules exclude almost any feedback from other sectors of society. Most of its meetings are closed to the press (no information, no feedback). It forces nations into positive loops, competing with each other to weaken environmental and social safeguards in order to attract corporate investment. It's a recipe for unleashing "success to the successful" loops.
3. The power of self-organization.
The most stunning thing living systems can do is to change themselves utterly by creating whole new structures and behaviors. In biological systems that power is called evolution. In human economies it's called technical advance or social revolution. In systems lingo it's called self-organization.
Self-organization means changing any aspect of a system lower on this list—adding or deleting new physical structure, adding or deleting negative or positive loops or information flows or rules. The ability to self-organize is the strongest form of system resilience, the ability to survive change by changing.
The human immune system can develop responses to (some kinds of) insults it has never before encountered. The human brain can take in new information and pop out completely new thoughts.
Self-organization seems so wondrous that we tend to regard it as mysterious, miraculous. Economists often model technology as literal manna from heaven—coming from nowhere, costing nothing, increasing the productivity of an economy by some steady percent each year. For centuries people have regarded the spectacular variety of nature with the same awe. Only a divine creator could bring forth such a creation.
In fact the divine creator does not have to produce miracles. He, she, or it just has to write clever RULES FOR SELF-ORGANIZATION. These rules govern how, where, and what the system can add onto or subtract from itself under what conditions.
Self-organizing computer models demonstrate that delightful, mind-boggling patterns can evolve from simple evolutionary algorithms. (That need not mean that real-world algorithms are simple, only that they can be.) The genetic code that is the basis of all biological evolution contains just four letters, combined into words of three letters each. That code, and the rules for replicating and rearranging it, has spewed out an unimaginable variety of creatures.
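A bare-bones version of such an evolutionary loop, with an arbitrary target bit string standing in for a "fit" pattern (target, population size, and mutation rate are all demo choices):

    # Random variation plus selection discovers the target: the rules are
    # simple even though what they produce looks designed.
    import random
    random.seed(0)
    TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]
    POP, MUTATION = 30, 0.05
    def fitness(genome):
        return sum(g == t for g, t in zip(genome, TARGET))
    population = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP)]
    for generation in range(200):
        population.sort(key=fitness, reverse=True)
        if fitness(population[0]) == len(TARGET):
            print(f"generation {generation}: target matched")
            break
        parents = population[:POP // 2]              # selection: keep the fittest half
        children = [[1 - g if random.random() < MUTATION else g for g in p]
                    for p in parents]                # variation: mutated copies
        population = parents + children
    else:
        print(f"best after 200 generations: {fitness(population[0])}/{len(TARGET)} bits")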
Self-organization is basically a matter of evolutionary raw material—a stock of information from which to select possible patterns—and a means for testing them. For biological evolution the raw material is DNA, one source of variety is spontaneous mutation, and the testing mechanism is something like punctuated Darwinian selection. For technology the raw material is the body of understanding science has accumulated. The source of variety is human creativity (whatever THAT is) and the selection mechanism is whatever the market will reward or whatever governments and foundations will fund or whatever tickles the fancy of crazy inventors.
When you understand the power of self-organization, you begin to understand why biologists worship biodiversity even more than economists worship technology. The wildly varied stock of DNA, evolved and accumulated over billions of years, is the source of evolutionary potential, just as science libraries and labs and scientists are the source of technological potential. Allowing species to go extinct is a systems crime, just as randomly eliminating all copies of particular science journals, or particular kinds of scientists, would be.
The same could be said of human cultures, which are the store of behavioral repertoires accumulated over not billions, but hundreds of thousands of years. They are a stock out of which social evolution can arise. Unfortunately, people appreciate the evolutionary potential of cultures even less than they understand the potential of every genetic variation in ground squirrels. I guess that's because one aspect of almost every culture is a belief in the utter superiority of that culture.
Any system, biological, economic, or social, that scorns experimentation and wipes out the raw material of innovation is doomed over the long term on this highly variable planet.
The intervention point here is obvious but unpopular. Encouraging diversity means losing control. Let a thousand flowers bloom and ANYTHING could happen!
Who wants that?
2. The goals of the system.
Right there, the push for control, is an example of why the goal of a system is even more of a leverage point than the self-organizing ability of a system.
If the goal is to bring more and more of the world under the control of one central planning system (the empire of Genghis Khan, the world of Islam, the People's Republic of China, Wal-Mart, Disney), then everything further down the list, even self-organizing behavior, will be pressured or weakened to conform to that goal.
That's why I can't get into arguments about whether genetic engineering is a good or a bad thing. Like all technologies, it depends upon who is wielding it, with what goal. The only thing one can say is that if corporations wield it for the purpose of generating marketable products, that is a very different goal, a different direction for evolution than anything the planet has seen so far.
There is a hierarchy of goals in systems. Most negative feedback loops have their own goals—to keep the bath water at the right level, to keep the room temperature comfortable, to keep inventories stocked at sufficient levels. They are small leverage points. The big leverage points are the goals of entire systems.
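As an illustration, here is a minimal sketch, with invented numbers, of one of those small goal-seeking loops, a thermostat. The loop acts only on the gap between the room and its goal, and it settles near, though not exactly at, the target, because the heater responds in proportion to the shortfall while the room keeps leaking heat.

```python
# Minimal negative feedback loop: a thermostat holding a room near its goal.
goal = 20.0     # degrees C the loop tries to maintain
room = 12.0     # starting room temperature
outside = 5.0   # constant outdoor temperature

for hour in range(12):
    gap = goal - room                  # the loop responds to the gap, not the state
    heating = max(0.0, 0.5 * gap)      # heat input proportional to the shortfall
    leakage = 0.1 * (room - outside)   # heat lost through the walls
    room += heating - leakage
    print(f"hour {hour:2d}: {room:5.1f} C")

# The room climbs quickly, then levels off near 17.5 C, where heating exactly
# balances leakage; the loop's goal caps how far it will push the system.
```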
People within systems don't often recognize what whole-system goal they are serving. To make profits, most corporations would say, but that's just a rule, a necessary condition to stay in the game. What is the point of the game? To grow, to increase market share, to bring the world (customers, suppliers, regulators) more under the control of the corporation, so that its operations become ever more shielded from uncertainty. That's the goal of a cancer cell too and of every living population. It's only a bad one when it isn't countered by higher-level negative feedback loops with goals of keeping the system in balance. The goal of keeping the market competitive has to trump the goal of each corporation to eliminate its competitors. The goal of keeping populations in balance and evolving has to trump the goal of each population to commandeer all resources into its own metabolism.
I said a while back that changing the players in a system is a low-level intervention, as long as the players fit into the same old system. The exception to that rule is at the top, if a single player can change the system's goal.
I have watched in wonder as—only very occasionally—a new leader in an organization, from Dartmouth College to Nazi Germany, comes in, enunciates a new goal, and single-handedly changes the behavior of hundreds or thousands or millions of perfectly rational people.
That's what Ronald Reagan did. Not long before he came to office, a president could say, "Ask not what government can do for you, ask what you can do for the government," and no one even laughed. Reagan said the goal is not to get the people to help the government and not to get government to help the people, but to get the government off our backs. One can argue, and I would, that larger system changes let him get away with that. But the thoroughness with which behavior in the US and even the world has been changed since Reagan is testimony to the high leverage of articulating, repeating, standing for, insisting upon new system goals.
1. The mindset or paradigm out of which the system arises.
Another of Jay Forrester's systems sayings goes: It doesn't matter how the tax law of a country is written. There is a shared idea in the minds of the society about what a "fair" distribution of the tax load is. Whatever the rules say, by fair means or foul, by complications, cheating, exemptions or deductions, by constant sniping at the rules, the actual distribution of taxes will push right up against the accepted idea of "fairness."
The shared idea in the minds of society, the great unstated assumptions—unstated because unnecessary to state; everyone knows them—constitute that society's deepest set of beliefs about how the world works. There is a difference between nouns and verbs. People who are paid less are worth less. Growth is good. Nature is a stock of resources to be converted to human purposes. Evolution stopped with the emergence of Homo sapiens. One can "own" land. Those are just a few of the paradigmatic assumptions of our culture, all of which utterly dumbfound people of other cultures.
Paradigms are the sources of systems. From them come goals, information flows, feedbacks, stocks, flows.
The ancient Egyptians built pyramids because they believed in an afterlife. We build skyscrapers because we believe that space in downtown cities is enormously valuable. (Except for blighted spaces, often near the skyscrapers, which we believe are worthless.) Whether it was Copernicus and Kepler showing that the earth is not the center of the universe, or Einstein hypothesizing that matter and energy are interchangeable, or Adam Smith postulating that the selfish actions of individual players in markets wonderfully accumulate to the common good: people who manage to intervene in systems at the level of paradigm hit a leverage point that totally transforms systems.
You could say paradigms are harder to change than anything else about a system, and therefore this item should be lowest on the list, not the highest. But there's nothing physical or expensive or even slow about paradigm change. In a single individual it can happen in a millisecond. All it takes is a click in the mind, a new way of seeing. Of course individuals and societies do resist challenges to their paradigm harder than they resist any other kind of change.
So how do you change paradigms? Thomas Kuhn, who wrote the seminal book about the great paradigm shifts of science, has a lot to say about that. In a nutshell, you keep pointing at the anomalies and failures in the old paradigm, you yourself keep speaking, loudly and with assurance, from the new one, and you insert people with the new paradigm in places of public visibility and power. You don't waste time with reactionaries; rather you work with active change agents and with the vast middle ground of people who are open-minded.
Systems folks would say one way to change a paradigm is to model a system, which takes you outside the system and forces you to see it whole. We say that because our own paradigms have been changed that way.
0. The power to transcend paradigms.
Sorry, but to be truthful and complete, I have to add this kicker.
The highest leverage of all is to keep oneself unattached in the arena of paradigms, to realize that NO paradigm is "true," that even the one that sweetly shapes one's comfortable worldview is a tremendously limited understanding of an immense and amazing universe.
It is to "get" at a gut level the paradigm that there are paradigms, and to see that that itself is a paradigm, and to regard that whole realization as devastatingly funny. It is to let go into Not Knowing.
People who cling to paradigms (just about all of us) take one look at the spacious possibility that everything we think is guaranteed to be nonsense and pedal rapidly in the opposite direction. Surely there is no power, no control, not even a reason for being, much less acting, in the experience that there is no certainty in any worldview. But everyone who has managed to entertain that idea, for a moment or for a lifetime, has found it a basis for radical empowerment. If no paradigm is right, you can choose one that will help achieve your purpose. If you have no idea where to get a purpose, you can listen to the universe (or put in the name of your favorite deity here) and do his, her, its will, which is a lot better informed than your will.
It is in the space of mastery over paradigms that people throw off addictions, live in constant joy, bring down empires, get locked up or burned at the stake or crucified or shot, and have impacts that last for millennia.
Back from the sublime to the ridiculous, from enlightenment to caveats. There is so much that has to be said to qualify this list. It is tentative and its order is slithery. There are exceptions to every item on it. Having the list percolating in my subconscious for years has not transformed me into a Superwoman. I seem to spend my time running up and down the list, trying out leverage points wherever I can find them. The higher the leverage point, the more the system resists changing it. That's why societies rub out truly enlightened beings.
I don't think there are cheap tickets to system change. You have to work at it, whether that means rigorously analyzing a system or rigorously casting off paradigms. In the end, it seems that leverage has less to do with pushing levers than it does with disciplined thinking combined with strategically, profoundly, madly letting go.
This work is licensed under a Creative Commons License.
2007/03/13
Blog system updates and modifications
2007/03/12
Course materials for the 2007/03/31 class --- "Overview of the Ecological and Geographic Environment of the Greater Taichung Area"
I. The Impact of Natural Disasters on the Ecological Environment of the Dajia River
楊國禎
Graduate Institute of Ecology, Providence University
Along the boundary where the western Pacific meets the Asian landmass, the Philippine Sea Plate on the ocean floor is converging at roughly 7 cm per year …
Part 1: Environmental Overview
1. Overview of Fluvial Landforms
The Dajia River basin divides into three sections: an upstream ring-shaped catchment, a middle section of gorges cutting through the mountain ranges, and a downstream alluvial zone.
1. The upstream ring-shaped catchment
In the headwaters, the Yousheng Creek (formerly called the Ikawan Creek) descends from Siyuan Pass (formerly the Piyanan Saddle) …
This stretch of the river developed among towering mountains, and the boundary of its catchment forms a ring.
Between the high ridgelines of the Xueshan Range and the Central Range, on its west and east flanks, are sandwiched [landforms] of various heights …
2. The middle section of gorges through the ranges
Between Deji and Tianleng the river cuts through the Xueshan Range, separating Baigou Mountain from the main Xueshan ridge.
3. The downstream alluvial zone
Past Tianleng the present-day river, blocked by Touke Mountain, turns north, then turns west again after Dongshi.
At a broad scale the area divides into the Xinshe river terraces, the Houli-Fengyuan tableland surface, the Dadu Tableland, and the alluvial deposits at the river mouth …
2. Climate
- Temperature
Temperature falls as elevation rises; 戚啟勳 (1970) proposed a scheme for the temperature zones of Taiwan's mountains …
- Rainfall
Rainfall records for the weather stations of the Dajia basin, compiled from the various studies, are as follows: the Wuqi station …
Table 1. Annual, January, and July mean temperatures at weather stations along the Dajia River
Weather station | Elevation | Annual mean (°C) | January mean (°C) | July mean (°C) | Source |
Wuqi | lowland | 22.6 | 15.6 | 28.9 | Central Weather Bureau website |
Taichung | c. 50 m | 22.8 | 15.7 | 28.4 | Central Weather Bureau website |
Tianlun station | 610 m | 20.2 | 13.7 | 24.6 | 陳明義 et al. (1986) |
Huanshan, Lishan, Jiayang, Dajian | 1500-2000 m | 15-16 | 7.8-9.7 | 19.4-21.2 | 劉棠瑞、蘇鴻傑 (1978) |
Biluo agricultural weather station | 2350 m | 13.1 | 5.8 | 17.9 | 劉業經 et al. (1981) |
Estimated from the Nantou and Chiayi stations | 3000 m | 7.5 | 3.0 | 11.5 | 劉棠瑞、蘇鴻傑 (1978) |
Estimated from the Nantou and Chiayi stations | 3800 m | 4.0 | 2.0 | 8.0 | 劉棠瑞、蘇鴻傑 (1978) |
Yushan | 3900 m | 3.8 | -1.6 | 7.6 | Central Weather Bureau website |
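As a quick arithmetic check, mine rather than the source's, the table implies a mean lapse rate of roughly 0.5 °C per 100 m, if the lowland Wuqi station is taken to sit at about sea level:

```python
# Lapse rate implied by Table 1: Wuqi (lowland) versus Yushan (3900 m).
wuqi_temp, wuqi_elev = 22.6, 0        # annual mean in C; elevation assumed ~0 m
yushan_temp, yushan_elev = 3.8, 3900  # annual mean in C at 3900 m

lapse_per_100m = (wuqi_temp - yushan_temp) / (yushan_elev - wuqi_elev) * 100
print(f"{lapse_per_100m:.2f} C per 100 m")  # about 0.48 C per 100 m
```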
Temperature inversions occur in the valleys at night.
… rises, which is consistent with rainfall increasing with elevation on the topographically windward slopes.
Rainfall is concentrated in the May-June plum-rain season and the July-September typhoon season.
3. Ecology
As elevation increases and temperature falls, the different elevation bands take on the character of distinct climatic zones.
The vegetation surveys of the Dajia basin to date provide the evidence (徐國士 et al., 1984; 章樂民 …
(1) Alpine rock-field and scree ecological zone
Areas above 3500 m, or near high ridgelines approaching 3500 m.
- Alpine herbaceous plant communities
Distributed mainly along the Xueshan Holy Ridge, the Nanhu massif, Zhongyangjian Mountain, and the Hehuan area.
- Alpine dwarf scrub communities
Widespread across the Xueshan and Central Ranges; the shrub layer is about 1-2 m tall.
(2) Coniferous forest and grassland ecological zone
Mountain areas above 2500 m, composed mainly of single-species stands, [comparable] with the conifer forests of temperate regions …
- Yushan juniper forest
Found around 3500 m, where Taiwan fir reaches its upper limit and the forest line forms …
- Taiwan fir forest
Forms extensive pure stands between 3100 and 3500 m; in the Dajia basin, along the Central Cross-Island Highway …
- Taiwan hemlock forest
Forms extensive pure stands between 2500 and 3100 m; around 3100 m …
- Taiwan spruce forest
Confined to shaded valley slopes between 2500 and 3000 m across Taiwan.
- Taiwan two-needle pine forest
The upper Dajia basin is the main home of Taiwan two-needle pine forest, found mostly on dry, sun-facing slopes between 1000 and 3100 m, with extremes from 700 to 3200 m.
- Grassland
Large grasslands lie between 2500 and 3500 m, as in the Nanhu massif and along the Xueshan Holy Ridge.
(3) Mixed coniferous-broadleaf forest
In the Dajia basin, mixed coniferous-broadleaf forest occupies the range from 1000 to 3000 m.
- Cypress (hinoki) forest
Distributed mainly in the Anma Mountain and Baxian Mountain areas; red cypress (紅檜) and Taiwan yellow cypress (台灣扁柏) are together called hinoki cypress (檜木).
- Taiwan Douglas-fir forest
Grows mainly between 1300 and 2200 m across Taiwan, on steep rock walls along the streams.
- Taiwan incense cedar forest
Grows mainly between 300 and 1800 m across Taiwan; within the basin it is distributed from Qingshan to Dajian …
- Mid-elevation secondary broadleaf forest
Secondary broadleaf trees that form pure stands include Taiwan alder, Taiwan walnut, and cork oak …
- Oak forest
The more stable broadleaf forest has no single dominant species and covers an extensive area.
(4) Machilus-Castanopsis zone
Su (1984b) divided Taiwan's broadleaf forests below 2500 m into three zones.
1. Machilus-Castanopsis forest
Stands on ridgelines or the upper halves of rock walls are usually dominated by the oak family (Fagaceae), with ring-cupped oak in the canopy …
2. Low-elevation secondary forest
Below 1500 m, on landslide scars, on land abandoned after human damage and disturbance, and along roadsides …
- Riparian rock vegetation
Bare riverside rock walls carry vegetation dominated by deciduous trees, among which zelkova is the most abundant.
(5) Ficus-Machilus zone
The environment closest to the tropics, mainly the hills, tablelands, valleys, plains, and coast.
The main components of the native forest include figs of the mulberry family such as 九丁榕, 幹花榕, 水同木, and 大葉雀榕 …
Secondary vegetation is dominated by Taiwan acacia, together with macaranga, hackberry, paper mulberry, chinaberry, and 土密樹 …
On the Dadu Tableland, grassland or forest only recently abandoned, under fire compounded by soil erosion, often …
The riverbed, lying in open terrain, carries sandbars and mudflats; the sandbars are dominated by wild sugarcane and cogongrass.
The coastal wetlands hold broad stands of common reed, 雲林莞草, seashore dropseed, and manila grass.
Part 2: Effects of the Major Natural Disasters on Vegetation Ecology
- Fire
The Dajia basin suffers the most frequent forest fires of any forest region in Taiwan; in 1952 …
The fires are frequent for three reasons:
- A relatively dry climate
The ring of high ridges around the central lowland of the upper Dajia keeps moist air from arriving easily.
- Pine forest is flammable vegetation
This fire-prone zone is dominated by Taiwan two-needle pine forest, at elevations ranging in the extreme from 700 to 3200 m.
- Frequent human activity
After the Central Cross-Island Highway opened, the government allotted land in the greater Lishan area of the upper Dajia to the Veterans Affairs Commission …
Above 3100 m, Yushan cane is the plant best fitted to survive fire.
- Landslides
Squeezed between the plates, the geologically young, folded island of Taiwan is rising rapidly.
The upper Dajia River developed along the fault-structure line between the Central Range and the Xueshan Range.
In the mixed coniferous-broadleaf forest between 1500 and 2500 m, the first [canopy layer] is formed mainly by conifers …
Taiwan alder is the most widespread and fastest-growing secondary tree on scree and earthen landslide sites.
On rock walls, the dominant secondary colonizers are the conifers Taiwan incense cedar and Taiwan Douglas-fir …
If a mountain area between 1500 and 2500 m were to go 2000-3000 years without a landslide …
Part 3: Directions for Thinking About the Natural Disasters
- Fire
Dry climate: atmospheric circulation involves forces too vast to be turned around by human intervention at present.
Pine forest: if fires decline, the pine forest will gradually succeed toward other, more complex climax forest types.
Human activity: manage fire-related human behavior in the mountains firmly and effectively.
- Landslides
Under the squeeze of the plates, uplift and erosion set the keynote of Taiwan's tectonics; uplift and incision …
- Clearing away unstable surplus rock and soil to produce stable slopes quickly would be an enormous engineering task; the benefits should be evaluated and the work focused on key sites.
- Revegetate with fast-growing secondary species such as Taiwan alder, 尖葉楓, Taiwan red maple, zelkova, 裡白蔥木, Alishan hornbeam, 化香樹, 野桐, roxburgh sumac, and 賊仔樹, so that forest cover forms quickly and stabilizes the slopes; once the canopy closes, plant the dominant evergreen broadleaf species of the oak and Machilus-Castanopsis forests beneath it, to speed the change toward a stable, self-regenerating broadleaf forest.
- For the rest, see Appendix 1, "Recommendations After the First Survey of the Reopened Central Cross-Island Highway" (陳玉峰).
Part 4: References
- 王鑫,1983,台灣的地形景觀,渡假出版社。
- 王鑫,1989,雪山—大霸尖山地區地理、地形及地質景觀先期調查報告,中華民國自然生態保育學會。
- 林朝棨,1957,台灣地形,台灣省文獻委員會。
- 徐國士、林則桐、陳玉峰、呂勝由,1984,太魯閣國家公園植物生態資源調查報告,內政部營建署。
- 章樂民,1962,大甲溪肖楠植物群落之研究,臺灣省林業試驗所報告第79號。
- 戚啟勳,1970,台灣山地氣溫之特徵,氣象學報16(3): 13-23。
- 郭城孟,1995,七家灣溪潛在植被之研究,內政部營建署雪霸國家公園管理處。
- 張石角,2001,雪霸國家公園災害敏感地區921震災後調查與防範研究,國家公園學報11(1): 41-58。
- 黃增泉、王震哲、楊國禎、黃星凡、湯維新,1987,雪山–大霸尖山地區植物生態資源先期調查研究報告,中華民國自然生態保育學會。
- 陳玉峰,1991,台灣綠色傳奇,張老師出版社。
- 陳玉峰,1992,人與自然的對決,晨星出版社。
- 陳玉峰,1994,土地的苦戀,晨星出版社。
- 陳玉峰,1995,台灣植被誌(第一卷)總論與植被帶概論,玉山出版社,303頁。
- 陳玉峰,1996,台灣植被誌(第二卷)高山植被帶與高山植物,晨星出版社,622頁。
- 陳玉峰,1998,台灣植被誌(第三卷)亞高山冷杉林帶與高地草原,前衛出版社,632頁。
- 陳玉峰,2001,大坑頭嵙山系植被生態調查報告,台灣人文生態研究3(1): 111-163。
- 陳玉峰、楊國禎,1999,台灣檜木(林)歷來相關研究總評析,台灣人文生態研究2(1): 49-76。
- 陳玉峰、楊國禎、林笈克、梁美慧,1999,台灣檜木林之生態研究及經營管理建議(中部及北部地區),台灣省林務局保育研究系列87-4號。
- 楊遠坡、林則桐、呂勝由,1989,南湖大山圈谷及其附近植被之調查,內政部營建署太魯閣國家公園管理處。
- 劉棠瑞、蘇鴻傑,1978,大甲溪上游臺灣二葉松天然林之群落組成及相關環境因子之研究,臺大實驗林研究報告121: 207-239。
- 劉業經、歐承雄、童兆雄,1981,畢祿溪集水區森林植群之研究,中華林學季刊14(1): 1-20。
- 廖日京,2001,綠化(森林火災),台灣大學森林學系。
- 謝長富、謝宗欣、林淑梅,1989,德基水庫溫暖帶雨林之結構及演替,臺灣省立博物館半年刊42(2): 77-90。
- 謝長富,1989,中橫公路沿線植生演進之調查(一),行政院國家科學委員會防災科技研究報告77-59號。
- 謝長富、楊國禎、謝宗欣、林淑梅,1990,中橫公路沿線植生演進之調查(二),行政院國家科學委員會防災科技研究報告78-63號。
- Su, 1984a. Study on the climate and vegetation types of the natural forests in Taiwan (I): analysis of the variations in climatic factors. Q. Journ. Chin. For. 17(3): 1-14.
- Su, 1984b. Study on the climate and vegetation types of the natural forests in Taiwan (II): altitudinal vegetation zones in relation to temperature gradient. Q. Journ. Chin. For. 17(4): 57-73.
Appendix 1: "Recommendations After the First Survey of the Reopened Central Cross-Island Highway," China Times, 2001.07.25, by 陳玉峰
"Where the road is the stones' to walk, we will not fight them for the right of way; where the home is the stones' to live in …
On July 20 of this year, two months short of the second anniversary of the 921 earthquake, the author, at the invitation of the Executive Yuan's post-quake reconstruction [commission] …
1. The Central Cross-Island Highway should be reopened only if it can meet the conditions of overall economic benefit and unimpaired safety.
Whether the highway can be reopened must be weighed first of all against the risks of the terrain, the strata, and landsliding.
The Executive Yuan is urged to carry out a long-range [assessment] of the overall social costs and benefits of reopening or not reopening the highway …
In the author's view, if only a service road for essential official business were kept open, giving the highway a decade of …
2. The 921 Reconstruction Commission should not be merely a post-disaster task force; it should also shoulder [responsibility for] national land use …
The Executive Yuan is urged to weigh carefully that reconstruction is not a synonym for restoring what was there before, but a thorough re-examination of what the old century failed to …
3. Although the 921 earthquake dealt Taiwan immeasurably heavy damage, it also created for the earth sciences …
http://ecology.org.tw/enews/enews121.htm