By popular demand, the green links have been expanded and consolidated onto a page that is accessible from the left hand navigation pages.
If you have suggestions for more links, please let me know via a comment on this post.
The sad reality of cloud computing right now is that the concept of resource virtualization is currently very primitive. Almost everyone on the cloud computing bandwagon seems to be fixated on the merits of aggregating the supply of primitive resources (memory, byte buckets, CPU).
In a step backwards, the early cloud computing groupthink revolves around rewriting the I/O layer so you can keep your data in virtual buckets. It's astonishing that people actually believe that the Simple Storage Service is an innovation just because Amazon has made byte buckets available on demand.
Or that SimpleDB is a breakthrough because they slapped an ISAM index on buckets and have a non-standard SQL-like interpreter (with little query optimization) to make it easier to use.
Or that everyone seems to think that Cloudfront is going to redefine content distribution networks because it offers content distribution without cache coherency.
Yawn. Welcome back to the early 1980s.
Clearly the industry can do better and Microsoft’s recent Azure announcement certainly raises the bar in a way that might wake up some other interesting players.
For example, Oracle appears to have been on the Cloud Computing (CC) sidelines because of their belief that CC is fashionable gibberish. Looking at Amazon and others it is easy to understand this point of view. However Azure changes the playing field substantially.
In true MSFT fashion, Azure consists of many things – a clear sign that MSFT is serious about winning market share in the cloud. Some of the more interesting elements include
While Amazon relies on revolution and using EC2 necessitates a re-write of the I/O layer in most applications, Microsoft is promising smooth transition and evolution of many existing applications into the cloud.
It’s not hard to imagine which will be more appealing to all those IT departments with more work than people! If MSFT can actually make Azure work as advertised sooner rather than later, Amazon will rapidly resemble the former Netscape (anyone remember them?).
However the battle is far from over.
Although there are indeed benefits from aggregating primitive resources such as CPU and storage, the economics of this are entirely based on economies of scale. Cloud computing solutions that offer undifferentiated access to commodity resources will compete solely on cost – not on value.
Small wonder that the early players are those with excess Internet-accessible CPU and storage. Amazon has already invested in large computer farms to run its online retail business, so why not sell its excess capacity at marginal cost via EC2?
Doing real work on the cloud, however, requires using higher level resources such as websites, databases, and other middleware. Social networking sites such as Facebook and various blogsites, already offer higher level resources (such as websites and media storage management) on a cloud basis.
Offering the necessary APIs and tools to provide access to higher level resources so that custom applications can be migrated to run on the cloud is a natural step forward – as demonstrated by Azure’s easy support for migrating entire SQL Server databases into the cloud and by Oracle’s support for live migration into Oracle Virtual Server.
Although there is a greater variety of higher level resources, ultimately this approach is still a supply-side resource aggregation play where the service is sold at its marginal cost. Small wonder that Oracle offers its enabling virtual server for free.
Inevitably Oracle (or another established player) will enter the game with a solution as competitive as the one MSFT is promising. For example, Citrix is already offering a cloud product suite that is far more compelling than Amazon’s.
The only difference right now is that Oracle and Citrix are deciding to offer cloud-enabling software products and not cloud services, while Microsoft and Amazon are offering their cloud-enabled products to users of their proprietary cloud service.
The real end game will be fought over how well these competing cloud solutions can dynamically match the massive supply of resources to real-time workload demand.
Prior experience in creating and working with massively scalable grid computing products (such as Platform Computing, DataSynapse, Globus, etc.) shows that dynamic resource management is far from trivial. It involves managing both the demand-side as well as the supply-side of the virtualized environment.
On the demand-side it requires rich scheduling of millions of requests for complex resources in seconds. Workload is specified based on the resources required to complete it (e.g. this version of O/S, that version of database, etc.) in the form of Jobs or Transactions (depending on the type of workload). Scheduling is based on matching resource requirements to resource availability in real time.
On the supply-side, virtualization is used to segregate resource consumers, and it requires dynamic re-provisioning of supply in response to shifts in workload demand for different resource types. For example, if you need more Apache and less IIS at any instant in time, the virtual pool of resources is re-provisioned accordingly by loading the necessary virtual machines on the fly.
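To make the demand/supply matching concrete, here is a toy sketch of the core loop – hypothetical names and toy data, not any vendor's actual scheduler: each job declares the resources it requires, the scheduler matches requirements against the available pool, and unmet demand becomes the signal for re-provisioning.

```python
from collections import Counter

def schedule(jobs, pool):
    """Match each job's resource requirements against the available pool.

    jobs: list of dicts like {"id": 1, "needs": {"apache": 2}}
    pool: Counter of available resource instances, e.g. {"apache": 3}
    Returns (placed_job_ids, unmet_demand); unmet demand is what a real
    system would feed back into dynamic re-provisioning.
    """
    placed, unmet = [], Counter()
    for job in jobs:
        needs = Counter(job["needs"])
        if all(pool[r] >= n for r, n in needs.items()):
            pool -= needs              # consume supply
            placed.append(job["id"])
        else:
            unmet += needs             # record unsatisfied demand
    return placed, unmet

pool = Counter({"apache": 2, "iis": 1})
jobs = [{"id": 1, "needs": {"apache": 1}},
        {"id": 2, "needs": {"apache": 2}},   # cannot fit after job 1
        {"id": 3, "needs": {"iis": 1}}]
placed, unmet = schedule(jobs, pool)
print(placed)        # [1, 3]
print(dict(unmet))   # {'apache': 2} -> provision more Apache VMs
```

A real grid scheduler must do this for millions of requests per second with priorities, preemption, and fairness policies, which is exactly why the problem is far from trivial.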
By offering dynamic distributed resource management, cloud computing will move from supply-side cost economics to demand-side value economics.
Also in 1977, the Canadian Department of Energy, Mines, and Resources published “An Energy Strategy for Canada”, which mapped out the detailed policies necessary for energy self-reliance and environmental sustainability. For example, the report identified the prime importance of appropriate energy pricing to cause shifts in consumption patterns. (Today this same concept has resurfaced as “carbon taxation”.)
Other innovative policy imperatives identified in this report included:
These concepts vaulted Canada into a leadership position internationally on this topic. These ideas were widely reported and many young people, including myself, went “Green” and initiated Conserver Society practices such as recycling, re-use, repair, re-purposing, etc.
For example, the first community paper recycling project in Canada was launched by a volunteer group of Queen’s students in Kingston in the winter of 1978 (and I have the privilege of having been one of those volunteers). This project was ultimately transferred to the City of Kingston and is now commonplace in virtually every municipality in Canada. The Recycling Council of Ontario was also born in 1978, as were similar recycling initiatives in Toronto and other cities.
But these initiatives were not enough to change our society into a Conserver Society. Although Canadians started recycling (e.g. half of the paper in Canada is currently recycled), and we did a better job of insulating our homes, we only reduced the trajectory of our energy consumption. We did not fundamentally change it.
30 years later, according to “Key World Energy Statistics” from the International Energy Agency (2006), North America, with 5% of the world’s population, consumes 33% of the world’s resources.
Notice Canada’s embarrassing position at the far right of the graph. Nobody else consumes as much energy as we do per capita!
If every other person on the planet consumed on a per person basis as much as we do, we’d need another 3 planets to provide for that consumption!
This consumption translates directly into greenhouse gas emissions. The chart below shows the outrageous per capita CO2 emissions of Canada and the USA relative to the rest of the world. Thanks to our resource-intensive industries, we rank along with Australia and Saudi Arabia among the world’s worst polluters. Notice that most European countries are significantly less egregious emitters of CO2.
Our recent election proved that politicians do not lead change, they respond to popular opinion. We must change popular opinion first and become embarrassed enough to get out of our comfy chairs.
Over 279,500 homeowners in the USA received a foreclosure notice in October – up 5% from September. According to RealtyTrac, a website that has found a way of profiting from this problem by becoming a clearinghouse for foreclosure-related information, by the end of this year 1/3 of all US homes that are for sale will be bank-owned properties that have been repossessed.
That represents over 1 million homes.
Can it be that all of those mortgages should never have been granted in the first place? No.
Many of these homeowners were creditworthy but highly leveraged property owners. The foreclosures of the bad mortgages early in the meltdown sent all house prices down the toilet as banks sold at pennies on the dollar to raise cash. These legitimate property owners were then caught in a situation where their mortgages were considerably higher than the value of the house.
So if you have negative equity in your home, why would you keep paying the mortgage just to make your bank wealthy?
For example, in Northern Virginia where 1 in 200 homes has been repossessed by a bank, this 2700 sq foot home is worth $700 K and has been repossessed.
Does it look like it could ever have been owned by someone who did not have a decent income? I used to live near this Great Falls location, which is about a 30 minute drive from the US Capitol.
In Nevada, 1 in 80 homes has been foreclosed. In Florida, the foreclosure rate is 1 in 157.
Official sources usually describe greenhouse gas emissions by the weight of the gas produced, typically in kilograms of CO2. This is strange because we normally use kilograms to refer to the amount of a solid substance (weight), and we use volume to refer to the amount of a liquid or gas (e.g. Liters).
When CO2 is in solid form it is known as “dry ice” and it sublimates into a gas at any temperature above -78.51 C. In fact liquid CO2 can only be formed at high pressures greater than 5 atmospheres. Since CO2 is normally a gas called carbon dioxide, don’t you sometimes wonder how much of the gas is actually produced and how much space it takes up?
There is no easy conversion of weight to volume because different liquids and gases have different molecular densities. The one easy exception, which proves this rule, is that 1 liter of water weighs 1 kilogram at sea level (the original definition for these units).
The combustion of all carbon-containing fuels, such as methane (natural gas) and petroleum distillates (gasoline, diesel, kerosene, propane), but also of coal and wood, will yield carbon dioxide and, in most cases, water. As an example, the chemical reaction between methane and oxygen is: CH4 + 2*O2 -> CO2 + 2*H2O.
Since coal is 60 – 80% pure carbon (depending on the “hardness” of the coal in question), burning coal is pretty much the same as burning carbon. Burning carbon in the presence of oxygen (O2) has two possible combustion reactions: C + O2 -> CO2 (i.e. carbon dioxide) and C + 0.5*O2 -> CO (i.e. carbon monoxide).
If too much carbon is present (relative to the amount of oxygen) when the combustion occurs, then CO will be produced. When sufficient oxygen is present CO2 will be produced instead of CO, unless the combustion occurs at very high temperatures such as above 800 C. Since the self ignition temperature of coal is 400 to 425 C (depending on moisture content and environmental conditions), the burning of coal is well below this threshold and CO2 is the main byproduct of the combustion.
To calculate how much CO2 is produced in the chemical reaction from burning coal, we need to balance the chemical equation using moles. Carbon has a molar mass of 12.011 grams per mole and CO2 has a molar mass of 44.009 grams per mole.
1 pound of coal contains 453.59 g/lb x 70% = 317.5 g carbon, or 317.5 / 12.011 g/mol = 26.4 mols of carbon. Since the number of atoms in a chemical reaction remains unchanged, burning 1 pound of coal must produce 26.4 mols of CO2. This amount of CO2 will weigh 26.4 x 44 = 1163 grams = 1.16 Kg.
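This mole arithmetic is easy to verify in a few lines (the 70% carbon content is the assumption carried over from above):

```python
GRAMS_PER_POUND = 453.59
CARBON_FRACTION = 0.70   # assumed carbon content of the coal
M_CARBON = 12.011        # molar mass of carbon, g/mol
M_CO2 = 44.009           # molar mass of CO2, g/mol

carbon_g = GRAMS_PER_POUND * CARBON_FRACTION   # ~317.5 g of carbon
mols = carbon_g / M_CARBON                     # ~26.4 mol of carbon -> CO2
co2_kg = mols * M_CO2 / 1000                   # ~1.16 kg of CO2
print(f"{mols:.1f} mol -> {co2_kg:.2f} kg CO2 per pound of coal")
```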
We can apply the law for ideal gases, V=nRT/P, to convert from mols to liters if we know the temperature and the pressure of the gas:
Suppose the coal is burned at 415 Celsius at sea level (101.325 kPa is the average sea level barometric pressure),
V = (26.43 Mols) * (8.3145 L*kPa/Mols/K) * (415 Celsius+273.15 Kelvin) / 101.325kPa = 1492 Litres
Of course, the resulting CO2 gas won’t stay that hot and it will contract in volume as it cools. By using the same calculation with 15 degrees C (the average temperature of the earth, across all seasons and geographic areas) instead of 415 C,
V = (26.43 Mols) * (8.3145 L*kPa/Mols/K) * (15 Celsius+273.15 Kelvin) / 101.325kPa = 624.98 Litres
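The same ideal-gas calculation, wrapped as a small function so it can be reused at any temperature and pressure:

```python
R = 8.3145  # molar gas constant in L*kPa/(mol*K)

def gas_volume_litres(mols, temp_c, pressure_kpa=101.325):
    """Ideal gas law V = nRT/P, with temperature given in Celsius."""
    return mols * R * (temp_c + 273.15) / pressure_kpa

print(gas_volume_litres(26.43, 415))  # ~1492 L at combustion temperature
print(gas_volume_litres(26.43, 15))   # ~625 L once cooled to 15 C
```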
So 1 pound of coal cools into 625 Litres of CO2 once the gas has cooled down to ambient air temperature. The density of this gas is 26.43 mols x 44 g/mol / 625 L = 1.86 g/L.
Burning 1 long ton of coal (2240 pounds) therefore produces about 1.4 million litres of CO2 gas (2240 x 625 L) weighing about 2.6 metric tons (2240 x 1.16 Kg). This volume of gas is roughly 11x the volume of the 40′ x 20′ in-ground swimming pool in my backyard. This is a pretty big pool that can easily be seen from space by the Google satellite in the centre of the photo below.
So for every ton of coal burned, we end up with about 11 swimming pools worth of CO2 hanging around in the atmosphere.
We could plant a lot of trees to absorb that, but a typical mature tree weighs about 3 tons, just over 2 tons of which comes from absorbed carbon dioxide. Each ton of coal burned would need at least one tree planted and 50 years of growth to offset the carbon dioxide. But, as Ronald Reagan pointed out, trees “cause pollution” because they also release carbon dioxide.
Taking this into account, an average tree absorbs about 25 grams more CO2 per day than it releases. A typical person exhales about 2 to 2.2 pounds of CO2 per day, so the entire Earth’s population generates about 6 to 7 million tons of CO2 per day just by breathing. Just to keep up with human CO2 output from breathing we would need roughly 270 billion trees (about 6.7 billion kg per day divided by 25 grams per tree) – without offsetting the burning of any coal!
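A quick sanity check of this back-of-envelope arithmetic (the world population figure is my assumption for circa 2008):

```python
POPULATION = 6.7e9        # world population circa 2008 (assumption)
CO2_PER_PERSON_KG = 1.0   # ~2.2 lb of CO2 exhaled per person per day
TREE_NET_G_PER_DAY = 25   # net CO2 a mature tree absorbs per day

daily_tonnes = POPULATION * CO2_PER_PERSON_KG / 1000            # ~6.7 million t/day
trees_needed = POPULATION * CO2_PER_PERSON_KG * 1000 / TREE_NET_G_PER_DAY
print(f"{daily_tonnes:.2e} tonnes of CO2 per day from breathing")
print(f"{trees_needed:.2e} trees needed just to keep pace")
```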
The other major natural carbon sink is the ocean, which currently absorbs approx 6 Billion Tons of CO2 per year. Although this is a very large number, studies estimate that it amounts to only about 27% of our collective CO2 emissions worldwide.
Guess we’ll just have to find ways to burn less coal!
Canadians emit an average of 19 tons of CO2 per person per year compared to 8 in the UK and even less in Scandinavia. A cap and trade system will limit the growth of CO2 emissions but it will not really help diminish them very quickly.
This is because all cap & trade systems are based on limiting growth in emissions above their current level – forcing companies to become more emission-efficient if they are to grow. If a company cannot reduce emissions faster than it intends to grow, then it must trade to get credits that another company is able to generate via “excess” reductions in its reduction program. The idea is to progressively lower the cap over time, and this clearly takes a while to bring emissions down. For example, the European Union has had a cap & trade system in place for several years, but only 2 of 25 countries actually have cap limits below historical levels!
Since a cap & trade system is based on limiting the quantity of emissions, the value of the credits is largely determined by how fast the cap is reduced. If the rate of quantity reduction is too high, not enough credits can be generated to be traded and the cost of compliance soars. On the other hand, if the rate of cap reduction is too slow, then the value of the credits is too low to be worth obtaining.
It is well known and widely accepted that current levels are too high. In fact Kyoto is all about reducing emissions by 6% below the 1990 level. Without effective government leadership, Canada is now running 22% above our 1990 level – a full 28% off target.
The main alternative to a cap & trade system is a carbon tax. This essentially fixes the price of compliance at a known level and corporate environmental impact planning is significantly clearer. The downside is that companies could choose to absorb the tax as a cost of doing business if it is not high enough – thereby resulting in insufficient reduction in emissions.
It is inevitable that Canada and the USA will impose a carbon tax since it is the only proven way to make any real progress on diminishing CO2 emissions. It has worked in other countries (without killing their economies) and it can work here. For example, the European Energy Agency estimates that the EU-15 has spent approx 1-2% of its GDP annually on environmental protection measures since 2001 and all those countries realized GDP growth rates equal to or higher than Canada and the USA during this decade.
The fallacy of so-called “intensity-based” targets is evident in any chart that shows whether progress is being made or not relative to Kyoto commitments. Since the USA did not sign Kyoto, only Canadian data is available from official sources as illustrated below:
So if intensity-based targets are meaningless, and if we have to do something about this intolerable situation sooner rather than later, we need a real mechanism for reduction. Carbon taxes can work and can also be used in combination with a cap & trade system. In fact in Europe, more and more countries are adding some form of carbon tax into their national policy for emission reductions as a means of accelerating compliance under the EU-wide cap & trade system.
It would seem that our civil liberties are yet again under attack by Big Brother.
The latest is the “requirement” for greater disclosure of personal information in return for not requiring Canadians to apply for a visa when visiting the USA.
Here we go hunting terrorists again. This constant surveillance must end. At what point do we become those that we fight against?
If Canada is reduced to a police state where every person is watched (aka Orwell’s 1984), what have we accomplished in “protecting” ourselves from so-called terrorism?
The real terror is Big Brother.
Yes. All the Scandinavian countries (Denmark, Norway, Sweden, Finland) as well as the UK, Netherlands, and Italy have had a carbon tax system in place for the past several years without impacting their economies. The European Energy Agency estimates that the EU-15 has spent approx 2% of its GDP annually on environmental protection measures since 2001 and all those countries realized GDP growth rates equal to or higher than Canada and the USA during this decade.
As the example graph for Denmark shows, there has been a substantial improvement in Denmark’s ability to meet Kyoto targets as a result of implementing a carbon tax system. According to the EEA, Denmark’s industry improved its CO2-intensity by 25 % in seven years from 1993–2000; the econometric analysis shows that at least 10 % resulted from the CO2 tax. The impact came about both through fuel switches and energy efficiency, each accounting for about half the CO2 reduction.
Originally published October 9, 2008 on Facebook.
The US subprime mortgage market was approximately $400 B. Nearly half of these loans were enhanced by so-called piggyback loans to help borrowers pay for the equity portion of the first mortgage. It was common for a first mortgage to cover 80% of home value (underwritten by Freddie Mac) and instead of the 20% buyer equity, the piggyback loan (usually from the same bank) would cover the difference – i.e. a second mortgage but with virtually no lien on the property.
Since the first mortgage was secured by Freddie Mac, it was easy for the originating bank to sell it to other institutions, so the bank only needed to fund the piggyback portion of the loan. And since those banks operated with 10x leverage ratios, they only had deposits & equity to cover 1/10 of those bad loans. 1/10 x 20% x $400 B = $8 B. Trouble indeed, as the estimate for defaults on those mortgages has escalated from approx 20% to 40%.
The first mortgages in turn were purchased by large lenders who pooled loans of various quality, which were sold to investment banks who in turn issued preference tranches on each pool. Top tranches would be paid out first, lower tranches would be paid out later, thereby artificially creating different instruments with different levels of credit grade. Since major banks operate with a leverage ratio of approximately 6:1, the securitized MBS must have had a book value of approx 6 x $400 B x 80% = $2 Trillion.
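The tranching described above is essentially a payment waterfall: cash collected from the mortgage pool pays senior tranches first, and any shortfall from defaults is absorbed from the bottom up. A toy sketch (illustrative numbers only, not actual MBS data):

```python
def waterfall(cash_collected, tranche_sizes):
    """Pay tranches in seniority order; juniors absorb any shortfall.

    tranche_sizes: principal owed to each tranche, most senior first.
    Returns the amount each tranche actually receives.
    """
    payouts = []
    for size in tranche_sizes:
        paid = min(size, cash_collected)
        payouts.append(paid)
        cash_collected -= paid
    return payouts

# A $100M pool split into senior 70, mezzanine 20, equity 10.
# If defaults mean only $85M is collected, the losses fall bottom-up:
print(waterfall(85, [70, 20, 10]))   # [70, 15, 0]
print(waterfall(100, [70, 20, 10]))  # [70, 20, 10] when nothing defaults
```

This is why the top tranches could be rated as high-grade even when the underlying loans were not: the structure, not the collateral, supplied the apparent safety.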
At the start of the crisis in 2007 as defaults mounted on the mortgages, the market for the MBS securities started to dry up. As shown in the chart above the estimated total impact of impaired MBS securities is $160 B. The immediate problem facing banks was the rapid increase in their funding requirements when they could not securitise or otherwise distribute their loan warehouses. Banks began to hoard liquidity to meet actual and potential increases in these funding requirements, causing interbank rates to spike during August and September 2007.
Towards the end of 2007, banks began announcing substantial losses on their own holdings of structured credit products. That $2 Trillion of market value was starting to unwind. Current estimates (see below) show that the structured credit market losses are in the vicinity of $400 B (so far).
An element of counterparty credit risk began to influence interbank lending decisions. Some banks could not gain unsecured funding, amplifying their financing difficulties. As the end of the year approached, banks sought to increase their liquid asset positions, in part to strengthen the appearance of their reported balance sheets. This was a major contributing factor to the rise in London interbank offered rates (Libor) internationally in early December. This was alleviated to some extent by co-ordinated central bank action on 12 December 2007 causing money market conditions to improve during January 2008.
In February and March 2008, however, money markets tightened again as banks reported significant additional write-downs on ABS and the prospect of losses on exposures insured by monolines increased. Central banks provided a second round of co-ordinated liquidity provision on 11 March 2008.
However, by Aug/Sept 2008 not just the originating bad lenders have gone under, but also the major investment banks and large US lenders that created the MBS mess have failed (or been forced to merge). This has extended the counterparty risk substantially to international banks and the dominos continue to fall.
We are now seeing bank failures in the UK, Germany, Holland, and Iceland. These will in turn extend the imprint of counterparty risk and cause even more bank hoarding. Banks hoard by declining to deal with other banks and by raising credit costs to businesses and consumers. The impact on business is growing with each passing week. Already 20% of all US car dealers are facing bankruptcy as they cannot finance their inventory due to tighter credit imposed by their banks and a drop-off in spending by consumers who can’t afford to pay the rising interest on their mortgages.
Substantial interest rate cuts will be necessary to improve bank margins (making it easier for them to hoard cash) but don’t expect to see any of it in your personal credit card, car loans, or mortgages and certainly not in your business lines of credit, lease rates, etc.
Coupled with over $1 Trillion so far in bailouts in the USA and UK to guarantee the bad loans and ensure bank creditworthiness, we can expect serious degradation in the buying power of cash. Only by devaluing the purchasing power of the dollar can the central banks suck excess liquidity out and devalue the bad loans at the heart of it all.
So don’t be surprised when everything costs a lot more this time next year.
The real evil behind the current global financial crisis is not sub-prime mortgages but the unregulated growth in M3 in the USA. The M3 is the broadest measure of money supply.
In 2006 the US Federal Reserve suppressed the publication of this key monetary measure, claiming it was irrelevant. However, it is still available thanks to the hard work of dedicated private economists, such as John Williams, who reverse engineer the statistic from published government financial data.
The chart below illustrates why M3 is hardly irrelevant. It clearly shows the run-up to and the crash due to the recent credit crisis.
One of the casualties of abandoning gold as the anchor for the US currency is that it allowed the explosive growth in M3 to occur unchecked by other regulation. The investment banks were able to fabricate credit instruments (measured by M3) that were used as “money” to finance everything. Without an anchor such as gold there was no limit to the amount of money that could be generated in USD. This effectively multiplied the amount of USD in global use – a statistic that is measured only by M3.
Notice how the M2 statistic in the chart above shows business as usual leading up to the crisis, while M3 screams danger.
In order to approve the bailout plan, Congress had to raise the debt ceiling for a second time this year to a whopping $11.3 trillion. If the United States actually does hit the $11.3 trillion mark, debt will then make up more than 70% of that nation’s gross domestic product (GDP).
Meanwhile the 2008 Q3 GDP results indicate that US GDP is shrinking by roughly 1% annually (a 0.3% decline from Q2 to Q3, which annualizes to about 1.2%). The Canadian GDP is also shrinking at the same rate, but federal government debt is only 48% of GDP.
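For what it's worth, the simple x4 annualization used here and proper compounding give essentially the same answer at this magnitude:

```python
q_change = -0.003                     # -0.3% change, Q2 to Q3
simple = q_change * 4                 # simple annualization: -1.2%
compounded = (1 + q_change) ** 4 - 1  # compounded: ~-1.19%
print(f"simple {simple:.2%}, compounded {compounded:.3%}")
```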
The situation underpinning the US GDP is not very good as these Q3 / Q2 comparisons indicate that Americans have significantly reduced investments in their own economy:
The high USD is also killing American exports to the tune of -350% per quarter! This is not good for the job situation in the US which in turn will fuel continued deterioration in consumer & residential spending.
Is this just due to a temporary blip in US banking credit? No. In fact the non-borrowed reserves underpinning the entire US banking industry are negative, and the situation has been getting worse each month since 2007.
In other words, the US banking industry is completely bankrupt and in aggregate is entirely propped up by borrowing from Federal Reserve (which in turn is financed by US government debt).
Is Canada in any better shape? With 80% of our GDP tied to the USA, the Canadian economy is just a lifeboat still tied to the deck of the Titanic by a very long rope. It is no coincidence that Canada is urgently exploring a free trade agreement with the EU.