acroyear: (geek2)
What Is Real, Anyways?

An essay I just wrote complaining about the constant barrage of 'is it photoshopped?' that still persists in spite of years of photographers trying to explain it. I probably won't change any of the naysayers' minds either, but I at least wanted to say something, and along the way, show off the different ways a single image might be seen.
acroyear: (fof earplug)
In reply to my FB post about watching the BBC Prom 2010 of Rattle conducting 3 works by Schoenberg, Webern, and Berg from their atonal period (1911-1920), and asking rhetorically again why I love that music, someone replied,
There really are some good (albeit mostly neoclassical) composers in the 20th century - Vaughn-Williams, Thompson, Copeland, Gershwin (that's about one every twenty-five years on average, but none of them worked much after 1950). the fact of the matter is that much 20th century music was "experimental" and like most experiments, it failed. I particularly remember one experimental piece that had breaking glass as the solo "instrument."
Needless to say, this got me started... )
acroyear: (don't let the)
In response to Mike the Mad's suggestion that old-school benefit pensions are a better alternative to the unreliability of market-based 401K plans, I wrote the following...

Pity the Poor Couple Who Make $250,000 Per Year (Haven't We Been Through This Before?) : Mike the Mad Biologist:
I still see (just looking at the blatantly obvious examples of the car and airline industries, and state governments) that the pension idea simply will never work anymore. The idea of a corporation paying for your retirement is a sign of "loyalty" that simply does not exist anymore, either from the worker to the company or from the company to the worker. It is predicated on the idea that a company will only ever get bigger and continue to grow its profits, something the industrial world has shown is simply no longer the case. Finally, it puts retired workers directly at odds with a corporation's REAL "customers", the stockholders, over who actually gets the lion's share of the profits even if that ever-increasing growth actually takes place. Given how the law benefits stockholder value when it comes to the legal responsibilities of a board and a CEO, the pension fund will always go before stockholder dividends do.

Now, the obvious "out" for all of that is to replace pension funds with retirement stock grants (my last company did something similar, replacing 401K matches with ESOP grants), but that just makes matters worse: not only is your retirement all dependent on a single company's performance far in the future beyond your time working for them, but unlike a pension fund, there is no government protection when that stock value drops down to a penny a share because the market forces simply left them behind or the CEOs screwed the pooch royally. Enron is a textbook example for that.

401Ks and IRAs may suck, and benefit the wealthy (or at least the upper middle class) more than most others, but outside of actually increasing Social Security benefits, they really are all we have left.

Faith in the success of a single company after you're no longer working for them is faith in a failed system, based on assumptions of future growth, or even future existence, that could never hold out forever...or even into next week.
acroyear: (geek)
I, Cringely » Blog Archive » The Future of Internet TV (in America) - Cringely on technology:
Now here’s the key for all the pundits who see Apple failing or faltering: you are looking in the wrong direction. It doesn’t matter how many networks are part of Hulu. In time they will probably all be there. But Hulu will remain an artifact of network labor agreements and will be vulnerable for that reason. Hulu can’t afford to PAY its way.

Follow the money.

Apple has at this moment just under $29 billion in cash and not many good ways to get a reasonable return on that money. Only Microsoft has more cash than Apple and Microsoft is being pulled in a lot more directions so Microsoft doesn’t have Apple’s flexibility.

What will Apple do with that money?

Most of it will remain unspent is my prediction, but I’m guessing we’ll shortly see $3 billion or so per year go into buying Internet rights for TV shows — not old TV shows but NEW TV shows, shows of all types.

TV production in the U.S. is approximately a $15 billion industry. An extra $3 billion thrown into that business would change its dynamics completely. Most production isn’t done by networks but by independent producers who are hungry for revenue and risk reduction. Three billion Apple dollars spread around that crowd every year would buy Internet rights for EVERY show — more than every show in fact. Whole new classes of shows would be invented, sapping talent from other parts of the industry. It would be invigorating and destabilizing at the same time. And because it is Apple — a company with real style — the new shows wouldn’t at all be crap programming. They’d be new and innovative.

And just as the artistic heart of TV shifted to cable with HBO in the 1980s, so it will shift to the Internet and Apple.

And where will be Hulu?

Nobody will care.
Actually, where iTunes can clean up is to get in the way of cable syndication and get stuff from other countries FIRST. Like, say, Dr. Who or Torchwood. If they can offer up enough money that the iTunes download (or eventual stream) comes out the day it's broadcast in the UK (just time-shifted to our time zone), they could get a huge chunk of the hardcore fans to pay up for it, fans who really hate waiting for some American network to decide "ok, there's enough episodes, I'll start running them" (which is the current reason why the last two Dr. Who episodes have not been seen in America yet).

Their other ace in the hole: foreign TV that cable doesn't give a crap about, TV that acknowledges that there are quality programs made in other countries in that country's *language*. If Apple opens up a market for foreign-language TV (and adapts iTunes search so you can find them), that's another huge audience of perhaps homesick geeks from other countries who would be willing to pay to see some of the stuff their families back home have been writing about. Apple also has the ability to get into this country that which it seems the FCC would rather not: news not filtered by "edited for America" censors (à la BBC World News on BBC-America).

--

I left more details in comments at the site:

For the English-speaking, they could also use it to buy the rights to more British (and Canadian) programming that the cable channels either ignore, or won't run until months and months later (e.g., Dr. Who). Many a supportive Dr. Who fan would much rather see it with their British friends on-line than wait 5 months (or more than a year, as will be the case with the 2009 productions) for some cable company to finally carve out a spot on their schedule. Plus there are many more brit-coms and dramas that get produced that may never get to this side, or only get one run at some unknown time on some random channel (like a PBS station in some rural part of the world) with no promotion, so nobody saw them. Dr. Who fans in America would *love* to see other shows that David Tennant or the upcoming Matt Smith have done (including one where he's with former companion Billie Piper), but that hardcore audience is so scattered and thin it makes no sense for a cable company to carry these shows.

In short: once again, I see you (as well as Business Week) arguing for giving Americans more ways to see the same shite that the media wizards seem to think Americans want to see (and given American Idol's ratings, I can somewhat understand why). But the real market potential of online is giving people what they can't get elsewhere because it is too expensive to distribute en masse. More than half the cost of "TV Series X season 1" is lost to DVD production and distribution, which also requires producing more copies than you might need in order to get every store to carry it. THAT model is just insane, guaranteeing that these things end up in the $3 bin in half the country, for product that never really sells well at full price no matter where you are.

Now take something where even that horrible distribution model is unprofitable (mostly because your target audience is concentrated in city suburbs, or scattered too thinly for store distribution to make sense) and, well, it’s a no-brainer to say “never ever show this to anybody”.

Online distribution is the key to getting this material to reach the audience that’s been begging for it on every online forum on media I know about.

Now, if only the media companies would realize that, just as with HiDef, there's still an audience that would rather pay a touch more for lossless music downloads than highly compressed mp3/m4a tin-box crap…

acroyear: (fof pb neverending)
In Greg Laden's Blog : Gnome vs. KDE, I wrote the following comment:
I've always been of mixed minds about these two. Now, mind you, I'm a developer and my history with X GUIs goes back to X11R4 and Motif 1.1.

From a development standpoint, each has its pros and cons.

Gnome pro: multiple languages can be used.
Gnome pro: look & feel is more like Motif.
Gnome con: the toolkit APIs changed fairly often in non-compatible ways.
Gnome con: if you didn't think .NET and C# were worth it, then Mono was a huge time-sink and a waste of resources.

Gnome con: the GTK layer duplicated a TON of X11 calls for almost no reason. The great truth of programming: all abstractions leak.

On the other hand, it did *eventually* (it took about 6 years) get to the point where a GTK program can run on Windows without an X11 server (Pidgin, formerly gaim), so in the long run it may have paid off to have hidden/duplicated all those pesky low-level X11 functions. Similarly, this now makes PHP a valid scripting language for desktop use on Windows as well, though I rarely need it.

KDE pro: generally stable API
KDE con: only one language (at the time; things may have changed)
KDE con: at the time, most apps for it were very primitive

General Linux Desktop con: the administration tools are, well, incredibly inconsistent. Even across subsequent releases of the same distro, I constantly find that some damned useful tool has been replaced by something next to useless, and it takes 3 more releases to find something as useful as what I had before...only to find IT disappear and be replaced by another useless tool.

So really, I don't care. I'm sick of the constant changes in the desktops' surface layers without any of them really getting to the heart of their faults, and I'm sick of the inconsistencies of things that I take for granted on Windows platforms, like guaranteed instant plug-n-play of USB hard drives (I still get surprised at distros that don't do this). At 38 with 50+ hour work-weeks and a desire to, well, read and relax and have a family and watch a movie or two and enjoy my music, I don't have the time to argue with distro testing and desktop comparing and all of that stuff that at 24 I thought was fascinating in the early years of all of this.

Yes: I was arguing with Linux distros and BSD for x86 and all of this stuff 15 years ago.

After 15 years, we should have been done with all of this by now and just gotten on with it.

This isn't "vi vs emacs" (go 'joe'!) anymore. This is two groups fundamentally wasting time making things that are 90% exactly the same and then arguing that the 10% is the end of everything. This is distros constantly changing the toolkits that administrators and developers NEED, ABSOLUTELY NEED to be stable so they can operate on rote, manage by exception, and just get on with their f'in' jobs. My job is not to figure out linux crap anymore. My job is to wrote code that works. Linux went from being a damn good tool to getting in my way more than Windows ever has, merely because every new distro changes EVERYTHING that I need to get my job done and I don't have time to relearn it all anymore.

15 years, and some of this stuff is basic and hasn't changed in a decade or more (like, say, Apache configuration), yet every release has a brand new tool to do it, and every release has every administrator just falling back on command-line and home-grown scripts because it is no longer worth their time to see if the next batch of tools will do what should have been done 15 years ago.

and THAT is why Linux will never "win" the desktop wars.

At least with Apple, Jobs created the "screw backwards compatibility, 'cause the next one will always be better" mindset up front and ran it successfully...well, until the current "5 functions, 1 button" garbage that is the new shuffle.
Geeze, that got long. Sorry...guess he tapped into some pent-up anger from my struggles to find a simple Linux distro for acting as a digital photo frame and video jukebox, which is all I need at home right now...
acroyear: (weirdos...)
There's nothing unique about Jim Cramer - Glenn Greenwald - Salon.com:
Jon Stewart is being widely celebrated today and Jim Cramer/CNBC widely mocked -- both rightfully so -- for Stewart's devastatingly adversarial interview of Cramer (who, just by the way, is a Marty Peretz creation). [...]

Stewart focuses on the role Cramer and CNBC played in mindlessly disseminating and uncritically amplifying the false claims from the CEOs and banks which spawned the financial crisis with their blatantly untoward and often illegal practices.

[...transcript included... Key quotes:]
Cramer: I had a lot of CEOs lie to me on the show.  It's very painful. I don't have subpoena power. . . .[...]

Stewart: But what is the responsibility of the people who cover Wall Street? . . . . I'm under the assumption, and maybe this is purely ridiculous, but I'm under the assumption that you don't just take their word at face value. That you actually then go around and try to figure it out (applause).
That's the heart of the (completely justifiable) attack on Cramer and CNBC by Stewart. They would continuously put scheming CEOs on their shows, conduct completely uncritical "interviews" and allow them to spout wholesale falsehoods. And now that they're being called upon to explain why they did this, their excuse is: Well, we were lied to. What could we have done? And the obvious answer, which Stewart repeatedly expressed, is that people who claim to be "reporters" are obligated not only to provide a forum for powerful people to make claims, but also to then investigate those claims and then to inform the public if the claims are true. That's about as basic as it gets.
I note that a few months ago, I said exactly the same thing about the Republicans vs. Democrats fight over ANWR and other drilling, where the media continued to mindlessly repeat that the "Democrats Say" blah blah blah.

Dude! Joe's Jottings, Mostly Junk - media rants for the day...:
ok, this has been the news media's take on this for the last 4 days: "Democrats say" over and over and over and over again. The he-said-she-said objectivity of the media, particularly the AP, on this (non-)vote has driven me fucking nuts.

Hey news media! I'm not paying you to tell me what "Democrats say" - I'm paying you to actually LOOK IT UP AND TELL ME IF THEY ARE LYING!!!!

You have been playing this "Democrats Say" card for 4 days, implying that they might be lying or simply wrong, and absolutely holding an uncertainty over us for something where THE REAL FACTS WOULD BE DAMN EASY TO FIND given your resources.

GO LOOK IT UP, DO SOME RESEARCH, AND TELL ME SOME TRUTH, DAMMIT!

In fact, I know that the full report would take more time than the 12 seconds you allocate to this on WTOPNews, but consider this:

If you look it up and find it's true, you can stop saying "Democrats Say" every time and bring it down to 10 seconds, giving you more time to say something else! Or if it's not true, you can add the line "but research shows they are wrong" - 2 more seconds to the article, but at least we can make an informed decision on it rather than relying on unsupported speculations...
Greenwald continues with a discussion about the morning of the Dick Cheney interview on Meet the Press, the very day the Times just happened to have published the leaked "aluminum tubes" story that became the Iraq smoking gun (a story discovered later to have been leaked by Cheney's own office), about how the Times "got the phone call" and Russert didn't, with Russert defending himself by saying he wished he had gotten the phone call. Bill Moyers pointedly asked (and Greenwald echoed), "why didn't you pick up the phone yourself and call someone?"

There's nothing unique about Jim Cramer - Glenn Greenwald - Salon.com:
It's fine to praise Jon Stewart for the great interview he conducted and to mock and scoff at Jim Cramer and CNBC. That's absolutely warranted. But just as was true for Judy Miller (and her still-celebrated cohort, Michael Gordon), Jim Cramer isn't an aberration. What he did and the excuses he offered are ones that are embraced as gospel to this day by most of our establishment press corps, and to know that this is true, just look at what they do and say about their roles. But at least Cramer wants to appear to be contrite for the complicit role he played in disseminating incredibly destructive and false claims from the politically powerful. That stands in stark contrast to David Gregory ["pointing out when officials are lying is 'not our role'"], Charlie Gibson ["And it is not our job to debate them; it's our job to ask the questions"], Brian Williams, David Ignatius ["journalistic rules meant we shouldn't create a debate on our own"] and most of their friends, who continue to be defiantly and pompously proud of the exact same role they play.
Why is a scandal where a politician lies about his sex life worth all the investigations and headlines in the world, enough to almost take down an administration, while a scandal where a politician (or businessman) lies about torture, lies about war, lies about imprisonment of innocents, lies about financial dealings, lies about science, lies about the influence his religion has on his political decisions, lies about the very things we put him in office with our own honest trust, is not worth an inch of column on page 10, or is relegated to the "opinion" columns where, in spite of the evidence presented, the paper can keep an appearance of "objectivity" in place, as "opinion" can't possibly be based on facts [with columnists like George Will, and the 'Post ombudsman who defended him, sadly reinforcing that stereotype]?

[However, when a rampant religious homophobe is caught lying about his own homosexual practices, that's worth page 1. You don't get to claim religious (self-)righteousness to remove the rights of others while at the same time sharing the most intimate feelings and actions you so openly condemn. Ditto those who would call for an affair to lead to leaving office when it's the opposition, but then say "we forgive you" when it's your own party - I can ignore an affair of a public official, but I will not condone hypocrisy.]

It is not being "objective" when one side is lying and the other telling the truth and you know the difference but won't openly say it.  It is being a disgrace to your profession and to the First Amendment that makes what you can do even possible.  You show you hold this Constitution in contempt.

No wonder you keep losing readers and audience figures to the two men most willing to point out the bullshit, each in their own way: Stewart and Colbert.

When journalists stop repeating the lies of our leaders and instead point them out more consistently, it might, it just might, lead to a new type of government where lying isn't the order of the day.
acroyear: (weirdos...)
Thoughts from Kansas : Disco. Inst. gets insulted again:
And furthermore, when did it become a requirement of free speech that every perspective be balanced?
One could say the same thing about freedom of the press. While the "equal time" law has its reasons for elections (the idea that candidates win based on their TV ratings, as shown in Max Headroom, is a scary one), it was relaxed for the right reasons and really only applies to elections, not to every "controversy", manufactured or real. Not every perspective and viewpoint is equal in terms of factual support and adherence to reality. Post-modernism is a poor philosophy.

The scary thing is that it was Democrats who were pushing for the "fairness doctrine", which (like the Republicans and the real line-item veto) was something they pressed for while in the minority and will likely ignore now that they're in the majority (just like the Republicans who insisted, unlike the then-minority Democrats, that they wouldn't filibuster every single thing that goes through the Senate, and today are doing exactly that).

TDP - We Need a Fairness Doctrine For Media:
Sen. Dick Durbin, D-Ill., says flatly, "It's time to reinstitute the Fairness Doctrine. I have this old-fashioned attitude that when Americans hear both sides of the story, they're in a better position to make a decision."
When two opposing viewpoints are presented in this manner, the American public is NOT in a better position. If anything, they are in a higher state of ignorance, because the two sides (like, say, creationism vs evolution, or climate change) are NOT EQUAL AND NEVER HAVE BEEN. Forced equal time, already practiced by many young journalists in the classic he-said-she-said (sometimes in their own ignorance, like when one went to get a pro-ID viewpoint and got YECer Ken Ham, who doesn't even like ID because it's not religious enough for his literalism), creates the impression that the issues are 50-50 when they are not. Things get worse when one side is able to lay down single-sentence talking points that are all false, points that take 30 seconds to say but 5 minutes (to 5 years) to factually show to be blatant lies.

When strictly emotional appeals get involved (the abortion issue, for example), "facts" become meaningless.

The only sides that want equal time and a fairness doctrine are the ones in the minority, the ones for whom the facts and evidence are against them: if they can limit the time the evidence is given, their emotional appeals can win out in public opinion.

But when evidence and fact are given all the time they need to present themselves and show the lies and willful bias of the opposition (Kitzmiller), the liars' real agenda, an agenda of political control over YOU and your lives, especially your freedom to be free of their closed-minded view of religion, becomes clear.
acroyear: (don't go there)
A wise teacher said this about 2000 years ago (or so it was written down around then), but the sentiment is far truer than the original context, of a money-man asking what he can do to enter the Kingdom. Yes, the obvious, "money" or "a good spiritual life", is certainly true (would that the money-hungry evangelical and Catholic churches had remembered this parable and lesson over the centuries...), but it runs far deeper than that. It applies to everything, including the ethics of business itself. continues with comments on the ethics of public companies, employees, HR outsourcing, and the current crisis... )
acroyear: (makes sense)
The World's Fair : Beach Volleyball and the Public Understanding of Genetics:
But if a Jamaican wins a race and everyone says its because of genes, then why isn't anyone asking if American women have the Beach Volleyball gene? Why isn't everyone asking if the Chinese have the gymnastics gene? Why isn't everyone asking if the Kenyans have the non-skiing gene? Why are Kenyans so god-awful at nordic events? Why oh why could that be? Is it the non-skiing gene? Given the reasoning advanced in the comments to Razib's post, I'm left to conclude that this must be the case. And while we're at it, why isn't everyone asking if the Orioles have the bad-pitching gene?
I do note that whenever Americans win something, it must be due to hard work, personal sacrifice, and a bit of Good Ol' American Luck. Except when it's American technology, like the swimsuits, but that's still "America's system (in this case, pure get-ahead capitalism) just being better".

But whenever someone else wins, oh, it's due to everything BUT the hard work of the athlete who really wants to win - it's the genetics, it's the intense state-sponsored training, it's the fact that they're paid for each medal, it's some innate advantage in the nation's geography or weather (never mind that our athletes move to where the weather is suitable for what they do, often taking their families with them), or, well, they must be cheating in some way. Or it's a sport that just "sucks", so why do we even bother with it.

It's never due to the idea that just like American athletes, the competitors from other countries just really, really want to win and work really, really hard to get there.  No, it can't be that.  There's no way the rest of the world can actually be just like us, right?

I don't know what's worse: that this egotism continues in the commentary and coverage, or that such a large portion of this country still buys into that crap.

We have, this week, seen the ideal American in many contests, not least Beach Volleyball, Gymnastics individuals & all-around, women's soccer, several track & field events, and much more, including Bryan Clay, who's actually doing damn well in the Decathlon.

But for all of those ideal Americans, I got two words that should flat-line those smiles and remind us what most Americans are really like: Bode Miller.

Stand tall.

Besides, sometimes positive genetics can have negative side effects, as this comment shows: "Oddly, I now hear that in the US there is strong linkage disequilibrium between the genes for sprinting and not being able to hold a baton." :)
acroyear: (this is art?)
Warning: this is a ramble, just a collection of thoughts as they came to me. Maybe there'll be a point at the end, maybe not.

What was the generation that decided that anything produced by any previous generation except their own wasn't worth remembering?

This question's been rambling through my head for the last few days as I've been watching my way through The Muppet Show season 2 (which very nicely, unlike season 1, has NO CUTS in it! :) ).

There's a lot of backlogged discussion on the web about Disney's acquisition of the Muppet characters and their inability to actually DO anything with them. Constantly, they're listed as being weaker characters only good for the baby boomer (and their children, all like me now in their 30s) remembrance and nostalgia market. Any attempt to bring them into the trendy 21st century, from Muppet Family Hour to Muppet Wizard of Oz, has inevitably fallen short, with the characters more rehashing their own old jokes rather than building on their personalities from within.
essay continues, mentioning Law and Order, Will Smith, Shrek, Pixar, and many more things... )
acroyear: (i'm ignoring you)
There's a site out there with a major essay decrying Disney's Animal Kingdom, and specifically the Kilimanjaro Safari, for how the inaccuracies between real nature and nature as Disneyfied are indicative of a larger "problem" with modern culture in how it presents a distorted vision of ourselves.

Disney's Animal Kingdom as a Projection of the Unconscious:
The site is part of a larger effort to reveal the way contemporary culture is turning our surroundings into a mirror that reflects back a distorted version of ourselves.
Well, two responses quickly come back. The first is trivial but critical. The second is more involved, less concise, and more revealing of the misdirected efforts of the site writers.

Response 1: Douglas Adams.
Art: none.  The purpose of art is to hold a mirror to the universe.  There simply isn't a mirror large enough.  See point 1.
Point one being, of course, that the Universe is bigger than the biggest thing ever, and then some.  Douglas always was a better writer than most, including me.

Response 2: Since when has art EVER presented an accurate vision of ourselves to ourselves? If we really wanted to see ourselves in a true mirror, we'd look in the mirror.

We don't.

We don't want to see the truth, the whole truth, and nothing but the truth. We'd never tolerate it. We'd never mentally survive. To use a modern cliché, "we can't handle the truth". Not because we can't imagine it or its consequences, but because we literally can't handle all the details.

All art is abstraction. It is removing some (actually, most) details in order to make other details more prominent. It either shows us more clearly particular details we don't want to see, or shows us more clearly the particular details we absolutely want to see. There is no in between. In between is the reality we can't face because it is both too difficult and, more importantly, too complicated. There are, literally, too many details. The ability not only to be abstract (most animals have this capability) but to express that abstraction clearly to others is a unique aspect of our humanity.

It is called communication. We cannot communicate every detail; there simply isn't time, and there certainly aren't enough words. Try explaining the first 8 bars of Beethoven's 6th: Bernstein once took 15 minutes to describe everything that happened in those bars, those first 30 seconds of music, and still didn't cover "everything". In fact, he probably described only 10 percent of all of the things going on during those 30 seconds.

We abstract.  It is the way in which we communicate.

Now, is communicating a distorted vision a bad thing?  No.  Rather, it's an inevitable thing.

The issue then is intent. How is it distorted, and what is the expected response to that distortion? This is the question all recipients of art are expected to ask and answer, for themselves at least. It is what makes art a participatory exercise even for the audience.

The conservation message, as presented by Disney, is an obvious distortion.  It is just as much a distortion of reality as their "true life adventures" were 50 years ago.  But then again, so was Wild Kingdom.  And National Geographic.  And every other nature series and every zoo on the planet that attempts to teach a conservation message in some form or another.

But how much of that is because the reality is simply so mind-boggling that it really can't be dealt with?

I have friends, including many here on LJ, who won't walk into the "ape" house at the National Zoo.  They simply can't deal with what they see as the forced imprisonment of our cousins, as if there was something distinctively separating our cousins from the other mammals (including the Pandas) in the park.

I say this: 1) there isn't. Then again, there isn't much separating US from the rest of those mammals either.

and 2) Those apes are still alive.

The zoo is an artistic presentation of apes, warts and all, in a way we might be able to tolerate. Because we can't begin to imagine the reality of their existence in nature.

Trust me. Don't read that article. You would immediately rather have every single gorilla, chimp, bonobo, and orangutan on the planet surviving in Western-style zoos than have them killed off in Africa by the excessive ignorance those people are following. You would be praising the National Zoo for its efforts to keep our cousins alive until the day the men in Africa stop being f'ing assholes of the highest degree.

Is Disney's Animal Kingdom, with its separation of species to prevent predation, a distortion of nature? Of course it is. It's merely art. A cultural artifact designed to communicate a specific message with an emotional content. Does the impact of the marketplace change the message? No, not really. Popularity in art is measured in different ways, themselves reflective of the culture, but the desire of art to be popular is unchanging, and it does not change the message or the reason it is abstracted and distorted.

That reason is reality.  The reality that we literally do not want to ever see in our lives.  We really couldn't comprehend it all, in every majestic and gory detail, even if we tried.  It's simply too big.

The authors write:
The park is a giant materialized projection of the unconscious mind that has been turned into a fantastic environment. What it is really about is our narcissistic desire to feel like we are grandiose heroes and saviors, on the side of right, and our desire to enjoy the full-throated optimism that comes with the sense that the cup of the world runneth over and death can be conquered. It is about our childhood desire to see wonders and get prizes and surprises.

The park is similarly about our desire for quick and easy transcendence -- transcendence for the price of admission.
Today we call it "escapist fantasy".  If it was an older culture, we call it "myth".    This idea that it is "contemporary" culture that's got it all wrong, merely because it seems to be driven by marketplace values, is itself misleading: *ALL* cultures have gotten it "wrong".  They all have been misleading, have been distorting, have been promoting the "better" and/or ideal aspects of the society they developed in rather than present reality as it is.  It is how societies *survive* without going mad.  Hiding the worst or displaying that worst with an objectivity allows one to be able to relate to it without the guilty conscience that leads one directly to madness.  Again, "art".

So why should DAK, or any other aspect of contemporary culture, be any different from any culture of the past? If it were, would it really be "culture" anymore?

There are many ways in which the authors seem to think that art and "culture" should be bigger than they ever really are.

To them I say, "see point 1".
acroyear: (epcot mine)
Originally posted for disneyworld, cross-posted here for my own records.

In the Disney blogosphere and "behind the scenes" literature, a lot of things are blamed for EPCOT's initial failures and subsequent inability to maintain a steady crowd and a loyal fanbase beyond the "geek" market. Walt's death and the inability of the Imagineers to build Walt's original vision (as we've now been able to see in full on DVD) rank pretty high. Budget crises and crunches also come up.

"Lack of thrill rides" was Eisner's motto, but his subsequent development has split Futureworld into two - a thrill side (Test Track, Mission: Space) and a kids-friendly (semi-)education side (Seas, Land, Imagination).  Rather than being a place the family stays together in (Walt's success in Disneyland), its become a place where families split up - mom and the under-10s go west, dad and the teenagers go east, and Grandma and Aunt Louise head south for the World Showcase.  It's a very Balkanized place right now and it's unlikely to change in that respect.

One easy target is just the typical audience - nobody wants to be "educated" on their vacations anymore (if ever; see how the educational parts of both coasts' Tomorrowlands are gone). Horizons was doomed from the start to disinterest from a crowd that had to think too much and wanted a day to stop thinking.

But really, there's a much simpler explanation, one Disney could have done (and done beautifully) but didn't.

EPCOT: just a spot on the map... )
acroyear: (claws for alarm)
Or specifically, the question of why the USDA argued in court against Creekstone's intention to test their beef, and why it seems to be defending the cattle industry's "right to profits" over its role of protecting the consumer from harm...

Background: Creekstone wanted to enter the market of selling beef to the Japanese. The Japanese required that they test everything, just as they do with their own, in order to do so. Their standards are much higher than ours (like the British, they've had problems in the past), and they have already banned almost all beef imports from America. Creekstone wanted to oblige in order to enter that market (since profits in America are pretty static) and so designed a program where they would test everything to the Japanese standard. The cattle industry (i.e., the rest of the competition in America) complained that Creekstone could use that testing as a marketing ploy in America, and competition would require that they all meet that same standard. This they say they could not afford to do. The surprise in all this was that the USDA sided with them (for increased government regulation to protect corporate profits) rather than with the attitude of free enterprise that says a company should be allowed to do anything ethical to remain competitive and secure an advantage.

So the question was, WHAT THE FUCK?  Why would the USDA be siding with corporate regulation against free enterprise in a manner that appeared to be against the consumer's best interests?

The answer? Like all government safety regulations (especially, say, car safety), it's playing the odds: how many would be hurt by it vs. how badly the industry would suffer, to the point of non-existence.

As I wrote at Effect Measure:
What has the cattle industry got to fear?

Losing so much profit by having to change their testing procedures to compete with Creekstone that they end up going out of business and we ALL starve.

The problem with commodities today is that there's (almost) no profit in them except by reducing costs and going mass-production on a scale unprecedented. Even then, the costs of labor increase while the cost of beef is forced cheaper by customer demand and government regulation.

So for the companies that are on the edge of profitability, adding that test might eliminate what little profit they have. Result: chapter 7 or 11, and while the courts are cleaning up the mess, no beef is produced, more money is lost, and the cost of beef for the consumer goes up.

Thus, the USDA acts to protect customers by protecting the corporations. They act as if the loss of a corporation through economic competition is more detrimental to the industry as a whole and the American consumer than the extremely rare case of a diseased animal making it into the food supply.

Like any insurance (or assurance) system, it's playing the odds and statistics with people's lives. We normally don't notice it until it personally affects us.

Better that a handful of people get sick and die than the entire nation starve because the corporations that feed them go out of business with no replacement.

Or at least, that's their view. Treat society as a whole and it makes perfect sense. Treat society as a collection of individuals with equal rights to life and it's utterly repulsive.

Now, the problem is that the Declaration of Independence demonstrates that the founders believed in the latter, and THAT's why the attitude of the USDA and the corporations, while "American" in terms of free enterprise, is Unamerican by the standards our founders had established.

Note I'm not saying it's illegal or unconstitutional, merely unethical by the standard the DoI established as a minimum.
Now the problem with this attitude of government/corporation protections is exactly what was demonstrated by Al Gore (An Inconvenient Truth) in emissions standards in automobiles.  While keeping design and manufacturing costs cheaper so that there's that little bit more profit, it cuts us off from export markets and restricts our market to merely the 300 million American citizens rather than the 6 billion people in the rest of the planet.  We've cut off our own noses and decided that a market 50 times the size of our own isn't worth the minor loss in profits in our own.  We've stopped taking risks. 
*Actually, there's another answer to that: our labor costs are too high, so there's no way we could compete on price alone, and our higher quality standards alone wouldn't be enough of a selling point to markets with so little money. We know this for a fact because it's exactly the argument made (and proven) when mass production with unskilled labor was introduced to other industries over hand-crafted works like furniture 100 years ago.
We used to be the "breadbasket of the world".  Now in spite of all of the wheat we grow, our pets are getting sick by possible Chinese grain that came into America to that plant that's right there in Kansas where all the fucking grain is being grown!!!

That's just insane, but that's what corporatism has done to us.

And NEITHER party is doing a damned thing about it.
acroyear: (hick)
Positive Liberty » Blog Archive » The Fugitive Slave Law:
I just have one item to add to Brayton’s post, and that is to mention that, at the time of the Civil War, the most horrendous attack upon the states’ autonomy, the worst assault on federalism that had been seen in the history of the union, was the Fugitive Slave Act. This act arrogated to Congress what amounted to a police power within the Free states, or anywhere else for that matter. It also directed the officers of state and local police agencies to arrest fugitive slaves or face federal penalties. Prior to this, state authorities were free to ignore federal fugitive slave laws, and it was often argued — successfully, and on the basis of English common law — that a slave was emancipated simply by entering a free state.

I do not believe that a similar federal police power was again to be seen, outside of martial law, until Prohibition. The Fugitive Slave Act’s greatest champions were all from the Slave states — that is, the very states that would later whine the loudest about states’ rights. They cared about slavery; all else was a means to that end.
For those that think that whining over "slavery" vs "states' rights" is mere historical second-guessing with 20/20 hindsight, it really is much larger than that.

The very issue at the heart of the 14th amendment is whether or not someone's rights are consistent with regards to the federal government no matter what state they live in. The 14th, basically the final peace treaty between north and south (and, yes, forced upon the southern states as a requirement to rejoin; that's what victory in war gets you), asserts that the federal constitution and bill of rights, as well as all federally recognized rights that exist through the 9th amendment, hold over ALL.

The Fugitive Slave Act is a violation of that principle (though not the amendment since it wasn't written at the time).

So is, in my interpretation (and others'), the Defense of Marriage Act. Your rights are different with regards to certain federal law provisions and protections depending on what state you live in. Now, the view of this depends on whether or not you feel that marriage is a fundamental right under the 9th amendment. I believe it is. The irony, just like the "states' rights" vs "fugitive slave law" irony of the past, is that those who most feel it IS a fundamental right and would argue such are the very ones who are trying to limit it to just *their* type of marriage, just as their argument for "freedom of the states" was really just for *their* particular version of "freedom" at the time.

The discrepancy between such views is why the government (at ALL levels) should just be getting out of the "marriage" business altogether and create a standard "family" contract of rights, privileges, and responsibilities, to be protected and enforced under contract law.

If marriage is a sacred "rite" then fine, let the churches do with marriage what they feel like and let the irreligious do with marriage what they like.  But aside from the right to perform such a rite, they shouldn't expect any other protection for it at all.

If marriage is a social contract, then fine, it's a contract.  So treat it as such, fairly and for all, with no discrimination.

But as long as one side argues it's the one (a sacred church rite defined by a God whose authority this Constitution rightly does not recognize, given that there are so many differing definitions of him), and then tries to weasel its interpretation into the other (a social contract recognizing some minimum protections and rights among the members of a household), this nation remains just as divided and discriminatory as we were 150 years ago.

Always remember - 40 years ago, under the definition of "marriage" according to the religions whose discriminatory beliefs dominated the law, I would not have been allowed to marry my wife.
acroyear: (makes sense)
There's a bit of research going on about how much the type of music a person actively listens to "says" about them. Most of their assumptions on the importance of music as a social filter seemed to be supported by the evidence, but they acknowledge that their survey was limited to the easiest subjects to find for college grad students: college undergrads. :)

Chris @ Mixing Memory, after summarizing the report, notes that there are likely to be expansions of the research into other age demographics to see how well it extends beyond school ages.

I took some time to think about it, reflecting on my observations of myself and my communities as an adult (and contrasting how I approach or discuss music today with how I would have 20, 15, 10 years ago), and have basically concluded no: they won't find music to be nearly as important a factor, or more importantly, filter, for social interactions as it was for people still in school.

Mixing Memory : What Does Your Music Say About You?: (comment by me)
Looking at the bigger picture, my "back of a napkin" guess would be that you would find music's importance decreasing as people get older.

Consider that when dealing with a large, diverse population, people tend to look for common ground with which to strike up a conversation, and from there a friendship (or more). Hobbies are obviously high as means for association, because most people who associate by the hobby do so likely because they met through the hobby. Certainly true for gamers, the SCA, amateur athletes.

But when that's not immediately available, when you're dealing with a ton of total strangers in your dorm, your classroom, your orientation session, then you need other filters to guide you to people you might agree with on enough topics to hold a friendship together.

The thing that puts music at an advantage over movies or TV is the choice factor. Yeah, you have some choices over the TV to watch or the movies to see, but those movies and TV shows are time-limited. Obsessing over TV is usually treated as a sign of weirdness or "geek", especially if it's sci-fi or a soap opera - certainly useful for geeks to unite around, but they usually can already recognize each other without actually having to broach the subject, by the merchandise attached to their clothes or their stuff.

Movies are extremely tied to time: their range of being a topic of conversation that someone might feel comfortable knowing anything of is rather small, and to make matters worse, talking about movies is actively discouraged because of the infamous "spoiler" problem: you can't talk about a movie with enough detail to appear knowledgeable to someone who hasn't seen it if you're trying to encourage them to see it. If they've seen it, you're preaching to the choir. If they haven't, you're spoiling it. If they have, but your analysis goes too deep, you're seen as a geek, just like with TV (Kevin Smith's various Star Wars references play out this stereotype rather nicely for the laughs - but of course, I now read like a total geek for having been able to refer to Kevin Smith, supporting my own argument).

So music wins out: it's the one thing people generally all have some opinion on, the one thing that can't be spoiled just by talking about it, the one thing where one's choices of fandom aren't temporally implanted by the media companies except for the absolute latest-and-greatest, and even then there's a longevity about it that's not seen in movies. Back in the 80s, following a song up and down the American Top 40 charts could take months, and long after that it might still make MTV rotation. That still exists today, though yes, the time period has gotten shorter.

But to adults, such things are less important in how we relate to each other. They're things we share still, but not nearly as often or with such emphasis. The reason is because we aren't introduced to such extreme diversity nearly so often. Usually our biggest change of surrounding people is when we change jobs. If we change towns as a result of having to change jobs, the hobbies will become the primary means of finding new people to associate with FAR more than anything as (now recognizably) arbitrary as music.

In short we're no longer just looking for people to talk to, we're looking for people to do stuff with because "doing stuff" is no longer provided by the environment we're in. In school, we're required to play sports, be in band, participate in class activities. None of those requirements are forced upon adults (and woe to the HR department of the company that tries to force it).

So we look for people to do stuff with, and music (unless you're a heavy concert goer or an amateur or semi-pro performer) simply isn't something people just "do" once they're adults.

Disclaimer: I'm 36 (and married). I wouldn't have said anything like this when I was 25 - music was still the most important topic of discussion and filter for finding friends in my life at the time.
acroyear: (bad day)
The two primary modern justifications for Daylight Savings are that it saves energy and that it encourages more retail shopping, since people tend to be out doing stuff later if the sun is out later.

Neither of these justifications was part of Ben Franklin's original proposal, where he merely raised the pragmatic issue of why waste having the sun out at 5am if nobody's awake to do anything with it. His view was much more oriented towards its obvious advantage for farmers. Energy had nothing to do with it, though it would indeed save candle consumption. His proposal was actually more in humor than seriousness, and he didn't actually suggest changing the clocks themselves. The first clock-changing proposal was from a golfer wanting that little extra time to play before dusk hit. And while Franklin suggested it might be better for farmers as well as merchants, it was the farmers who first led the repeal of DST in 1919, with enough support in Congress to override a Wilson veto. Farmers' hours are set by the sun and nature itself; no clock will ever change that.

Taking advantage of light was the exact reason the Germans adopted it during WW1, and it was then (like all WW1 innovations) immediately adopted by the Allies merely for their own survival.

The trouble with the modern justifications is that they are, in fact, completely bunk.

In the 50s, yes, it was more likely that a company had fewer lights on in the office building or factory floor and would take advantage of daylight as much as possible. This is no longer true today. All offices are flooded with fluorescent bulbs, on across the entire floor even if only one worker is there. No "energy" is saved by changing light bulb usage because light bulb usage simply is never changed. We're in, it's on, that's that. Indeed, union safety rules have also led to factory floor lights being on practically 24/7.

In fact, more energy would be used, because the modern A/C systems (which also didn't exist in the 50s) have to work harder when the sun is out than they would if darkness started to kick in.

A study in 1975 found that DST saved a whole 1% of electricity costs. In terms of nuclear power output, contrasted with trying to reduce coal plant production and emissions back then, 1% is negligible. Mexico's DST introduction reduced energy use by only 0.7%.

So much for saving energy.  It doesn't, and really it never has.

Now what about businesses getting more business?

Or really, the question is "so what if it's true"?

Because the reality is that while retail may be having a better time at it, all of the support structures that retail depends on are having a *really* crappy time of it.  Estimates are that the stock market losses the Monday after the switch, due to the lack of sleep of investors and floor traders, can run into the billions.

This current change has caused the major software makers to have to rush patches to their systems without adequate time to test them, and certainly there is much software in use that is not being properly maintained and will never be right and will have to be hand-managed. One of the really insane difficulties of this one is that many small localities chose not to adopt the DST change, which now means that software that has to know about the many time zones has seen the number of time zones to track *triple* to account for all of the little variations that have popped up as local legislatures have voted not to adopt the federal proposal. This has naturally led to a large number of bugs in the patches themselves that are trying to fix the problem.

Microsoft has put out 3 separate patches to its Exchange email/calendar system and still hasn't got it right. Meeting schedules have bounced back and forth to an hour off in both directions as a result of each patch, many of which assume that the prior patch had NOT been applied and so overcompensated. If my own company has had this disruption causing every worker to have to correct their calendars, how about a large Fortune 500?

Microsoft also formally did not support fixing Windows 2000, trying to use that as the final straw to get people to upgrade to XP or Vista. Trouble is, otherwise Windows 2K *still works* (as far as "works" means to Microsoft, of course). Most people don't need to upgrade, and in fact their fully functioning hardware would not handle the upgrade well, since the minimum requirements for XP are higher than a 2K box. They did eventually give in and sell a patch to fix 2K, but at $4000 a site (apparently a little cost and price fiddling to have the 2K patch, which is no bigger than the XP one, cover the expense of fixing XP as well).

Sun has also had to fix Java implementations that, because of compatibility issues, can't be upgraded to more recent versions.  Even then, they didn't get it right the first time, now causing 10 people in my company to waste another 2-4 hours each (napkin guess: $100/hr * 10 * 4 = $4000 we'll lose, minimum, all because Sun screwed up - and that's not counting the amount of work our customers will have to do, with a requisite downtime necessitating after-hours work, to apply the patch from Sun that we forward to them, added onto the amount of work we did figuring out and running the first patch we've already sent). I am, of course, one of those 10.  My total time has already been 16 hours arguing with and testing this problem, and I'm about to throw another 4 onto that.  20 hours of my time wasted.
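
For the curious, most of that "arguing with and testing" boils down to one trick: probe a date the old and new rules disagree on. Here's a minimal sketch of such a check using only the standard java.util classes; the class name and the choice of March 15, 2007 as the probe date are my own illustration, not Sun's actual test.

import java.util.Calendar;
import java.util.Date;
import java.util.GregorianCalendar;
import java.util.TimeZone;

public class DstPatchCheck {
    public static void main(String[] args) {
        TimeZone eastern = TimeZone.getTimeZone("America/New_York");

        // March 15, 2007 falls inside the new DST window (2nd Sunday in
        // March onward) but outside the old one (1st Sunday in April
        // onward), so it cleanly separates patched from unpatched tzdata.
        Calendar probe = new GregorianCalendar(eastern);
        probe.clear();
        probe.set(2007, Calendar.MARCH, 15, 12, 0, 0);
        Date when = probe.getTime();

        System.out.println(eastern.inDaylightTime(when)
                ? "This JVM knows the 2007 rules."
                : "Stale timezone data - this JVM still needs the patch.");
    }
}

Run that under every JVM on every box you ship to, and you start to see where the 16 hours went.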

And so you know, I now have even less respect for the asshole that made me waste 20 hours of my time better spent doing real work.

I hated this administration before for general reasons.

But this time, it's personal.

That's just 1 small company, out of thousands. If we've had to waste $100 * 10 * 20 = $20,000 over this issue (and that's being generous), multiply that by the 500,000 companies using Java and you have $10,000,000,000, yes $10 BILLION, wasted.

No amount of "increased retail sales" will ever make up for that.

Ever.

And that's just in dealing with Sun's Java problem. Now figure in Oracle/Sybase/MS SQL/MySQL, MANY Linux versions, often different and rarely upgraded (again, don't knock a system that's working), Solaris and HPUX, the tons of software and libraries used by IBM customers, and it *really* adds up. I wouldn't be surprised if total estimates on this problem actually exceed spending on the Y2K problem 10 years ago. Much of this problem is because the software now can no longer assume that the underlying operating system has been upgraded - Java and Oracle and all of that has to run on Windows 2K or Unix even if the box hasn't been patched, so it has to recognize whether or not the OS is right and adapt accordingly. This is not an easy task, nor is testing it by the end consumer.
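
To make concrete what "recognize whether or not the OS is right and adapt accordingly" looks like in Java terms, here's a hedged sketch of that kind of workaround logic (my own illustration, not Sun's or anyone's shipping code): probe the zone the JVM hands you, and if its tables are stale, substitute a hand-built SimpleTimeZone carrying the new rules.

import java.util.Calendar;
import java.util.GregorianCalendar;
import java.util.SimpleTimeZone;
import java.util.TimeZone;

public class TzWorkaround {
    // Returns an Eastern-time zone that follows the post-2007 US DST
    // rules even when the JVM's own timezone tables were never patched.
    static TimeZone easternWithNewRules() {
        TimeZone candidate = TimeZone.getTimeZone("America/New_York");

        // Same probe trick: March 15, 2007 is in DST only under the new
        // rules, so a stale JVM answers "false" here.
        Calendar probe = new GregorianCalendar(candidate);
        probe.clear();
        probe.set(2007, Calendar.MARCH, 15, 12, 0, 0);
        if (candidate.inDaylightTime(probe.getTime())) {
            return candidate; // tables are current; use them as-is
        }

        // Stale tables: hand-build the new rules over raw EST (UTC-5).
        // DST runs from the 2nd Sunday in March (the first Sunday on or
        // after the 8th) to the 1st Sunday in November, at 2:00 AM.
        return new SimpleTimeZone(
                -5 * 60 * 60 * 1000, "America/New_York-corrected",
                Calendar.MARCH, 8, -Calendar.SUNDAY, 2 * 60 * 60 * 1000,
                Calendar.NOVEMBER, 1, -Calendar.SUNDAY, 2 * 60 * 60 * 1000);
    }

    public static void main(String[] args) {
        System.out.println(easternWithNewRules().getID());
    }
}

Now multiply that bit of defensive paranoia by every zone with local variations, and you see why the number of zones to track tripled.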

To be perfectly honest, we've spent $20,000 and we actually honestly have no idea if any of it worked.  We really don't.

And that uncertainty is more costly than any piece of software we've ever written.

It should have been obvious when this was proposed that any time this administration is certain of something, we should all be certain that the only thing that will result is more uncertainty.

You want to know why the economy has *never* really rebounded since 2001?  Uncertainty.  Wall Street hates it and always will, but unless you're Halliburton or Exxon/Mobil, uncertainty is all this administration has ever given you.
acroyear: (rock)
Granted, it's the "entertainment news" and nobody gets their facts right, but still...

After 15 years, Genesis is back -- without Gabriel - Yahoo! News:
The rock band Genesis will tour for the first time in 15 years this summer, but without former front man Peter Gabriel, the group said on Wednesday.

[...]

Collins remained with Genesis until 1991, when he began performing as a solo artist. In 1997, the remaining members of Genesis released "Calling All Stations," a commercial disappointment.
Let's see now...
  1. No mention of long time guitarist Steve Hackett at all
  2. Too much of a mention of Gabriel, really, given that he was never involved in the major hits of Genesis and few people except die-hard prog fans have heard any Genesis from before the split.  Typical Gabriel fans are even less aware of his Genesis time than Genesis/Phil fans.  Granted, the fact that this is the 40th anniversary of the band implies a deeper reunion should have been attempted.  They also don't say why Gabriel decided not to join (facts on the public record from 3 months ago, when the idea was first proposed), but that's typical: the writer merely rehashed what was said at the press conference and did little backstory research, as you can see from the following...
  3. Phil began performing as a solo artist in 1980, with Face Value recorded in between Duke and Abacab
  4. Phil was touring solo by 1988 (after the Invisible Touch tour), and had a hugely successful 1990 tour
  5. Genesis released We Can't Dance in 1991
  6. Genesis continued to tour for a year and released a live album in 1993
  7. Phil didn't formally retire from Genesis until 1996 (though again, the band were in their normal between-album hiatus after the tour and live album), deciding that his solo career and the Disney offer to do Tarzan were more in line with the direction he felt he needed to take
They also under-emphasize Phil's statements that he'd retired from live performance in 2003 for fear of further damaging his hearing.

All of the Phil and Genesis facts are easily researchable on the public record.  Hey, you Reuters entertainment weenies: at least check your facts against Wikipedia before you embarrass yourselves.

BTW, I can't go to the DC show as it's at the Verizon Center on Sunday, Sep 23rd, which is, of course, a Faire night.  Not that I'm the type to spend $50 minimum on a concert and deal with nosebleed seats among 20,000 people anymore.

Yeah, I've gotten old.
acroyear: (foxtrot reverse psych)
prompted by a discussion from [livejournal.com profile] jocelyncs. this is slightly extended from what was originally left in the comments there.

the conversation led to the idea that Johnston's characters' "lives" were becoming her own.  I disagree with that simplification, thinking there's more to it than that.

actually, i think the "life" thing is the problem as a whole.

for most of the strip, the characters were a reflection of her family and her personal life, abstracted with the humor emphasized because, well, it's a comic.

the problems started after her own kids finally grew up and moved out: she no longer had prime source material to draw from.  she had already invented April, seeing as she didn't have a third kid of her own, merely to give space for other memories of her own kids combined with stories from friends who still had young kids she could relate to.  For at least 10 years now (given that her kids are as old as I am), she's had no kids in the house, no drama, nothing new to write about, no more watching other people come up with crazy stuff for her to capture and exaggerate for all to remember.

the result?

she's had to make it up, a lot more than she would have before.

Some of what she makes up, perhaps, may come from her sensing a need for new drama in her life, since as a successful businesswoman, artist, and mother, she's pretty much had the ideal life most can only dream about.  While you have kids, there's always *something*, but if that drama doesn't continue after your kids move out, the complacency can be maddening.

when an artist paints reality, it's always relevant and we can always relate to it.

when an artist has to *invent* a reality to paint, we notice.  we see patterns drawn not from the absolute human experience we all share, but from specific examples of things we know happen, just not always to us.  things look exaggerated rather than believable; coincidences become contrivances.

this is the point where something becomes a "soap": the situations are contrived to give the characters something to react to, rather than the characters themselves being the driving force behind the direction their lives take.

that in the end leads to the discomfort. characters we can relate to are those that take control of their lives; the characters in FBFW *used* to do that, but as she's grown older and her family has moved out and she no longer has the examples around her of how people do that, things happen to her rather than by her. By extension, things happen to her characters, rather than her characters doing things to others.

doonesbury, by contrast, has these things happen to its characters in order to give a personal touch to the impact of the actions of those in power.  it's a completely different attitude.  to give an example (the most striking of any comic strip incident, more so than any of the 4 deaths he's had): BD wasn't hit because Trudeau wanted to have more drama in his strip.  he was hit because Trudeau had been collecting in his head a thousand war stories from friends and fans, some of which he knew from the first gulf war.  those stories needed to be told to a wider audience, an audience that needed to be shaken up to the reality of the world, the war, and what it's doing to people.  there's the difference.  Trudeau creates coincidences in his characters' lives in order to show reality to the audience, particularly those aspects of reality that are the personal choices of men in power who know nothing of the consequences of their decisions, or worse still, intentionally ignore those consequences.
acroyear: (bird)
Just wondering how the labellists out there decide what belongs in classical and what belongs in "new age" (not that THAT label has any real meaning anymore, either).

Case in point: I picked up, on a whim, Steve Reich's Music for 18 Musicians, and I absolutely adore it, but aside from the instrument choices, it doesn't really strike me as "classical".  It's a product of its era, and sits more comfortably side-by-side (on my shelf, anyways) with Tangerine Dream, Jean-Michel Jarre, early pre-rock Mike Oldfield, and a large amount of Robert Fripp's output, along with a number of various synth artists out there.

The only difference seems to be that Reich's performers, like Glass's and Riley's, play every note, while the pioneering synth artists learned how to program a computer to handle the repetitiveness that would drive most human musicians nuts.  Certainly one can debate which is harder - playing such patterns consistently to the point of achieving music, or programming the old sequencers to do the same - and one can marvel at how the patterns Reich developed are now so subconsciously standard that pretty much any sequencer has them built in, out of the box, ready to be abused in creating another "pop" standard.

So really, when the late 20th century is looked at in hindsight, will these artists still be separated into the labels ("classical", "new age", "rock") that they currently are in (from a marketing view, not a historical one), or will successive generations look to this school of the late 20th century as a whole?

Actually, my real question is: can labels change as music is looked at in hindsight?

For a while, the Brahms/Schumann vs. Wagner/Bruckner split seemed to overwhelmingly define the view of the 19th century, but Mahler impressively combined the two styles (the stretched symphonic form of Brahms plus the harmonic ambiguity of Wagner) into one, while Debussy and Stravinsky launched into such realms of dissonance and tonal ambiguity as to make the critical fight between the two sides seem downright silly.  We just look at it all as late 19th century music now, reflective of the attitudes of the time.

So will the same happen for our century?  Will Stravinsky's embracing of serialism at the end of his career (I finally dug into Agon this week and adored it) be the joining point that will have historians stop treating the two main schools of mid-20th century music as separate?  Will minimalism's influence on electronic music (and vice-versa) lead to a unification of it and that subset of "new age" in the historical view?  How would such a merging evolve, critically, when we're all still so tainted by the labels attached to things in the record stores today (something that wouldn't have had an influence on how the early 20th century saw 19th century music)?

just random thoughts.  seeing how history evolves as new primary documents are presented, i tend to wonder how things today will be viewed differently given our *awareness* of the importance of primary documentation.  as more of how we view the world today is preserved than decades past, will our view taint how successive generations view this one?

(this all started in my head over 20 years ago, when my father once insisted that Dark Side of the Moon would someday sit side by side with a late Mozart symphony in the respect of music lovers of the future; little did he know that XM's Fine Tuning would almost do exactly that!  he's been right before; back then he also predicted that we'd be able to hold a symphony in a tiny chip on our fingertip, which is exactly what you can do 100 times over with a 1-gig SD card today).
acroyear: (space)
1 Year Ago: Scientists, science-teachers, and science-bloggers (including me) across the country celebrated the 139 pages of artwork in the Kitzmiller decision, showing that the courts could recognize what we all knew: ID is nothing but re-packaged creationism.

10 Years Ago: we lost Carl Sagan.  Anniversary memorials are pouring in across the science-blogging community.  Ann Druyan, Carl's wife and collaborator, is the best place to start if you're interested in reading about him today.

For myself, I couldn't possibly express the importance of Cosmos to this susceptible 10-year-old.  Much of my mind-set came from that work, but in stages, as I grew older and was able to interpret more of it.  First I learned the facts of the planets.  Next I learned of the people who discovered those facts; Cosmos was quite particular in also talking about their personal flaws and the mistakes they made along the way.  Next, in combination with James Burke's Connections, I better understood the order of things: how each one necessarily discovered what they found because of what came before them; there are few "eureka" moments, and never did a "see, I was right all the time" sentiment come from any of them.  Finally, I see the interconnectedness of all things (yeah, call me Dirk).  That you can segue from Whales to Neurology may seem odd, but like any bridge in music or poetry, it's just a matter of making the connections.

Artistically, I also learned much - one "split" in me is that I do tend to label music even as I can argue and insist that "it's all just music".  Cosmos was an inspiration in that as well.  The background music freely mixed baroque, classical, avant-garde, film music, early electronic "new-age" (Tomita, Vangelis), and rock and blues, as if there was no reason to divide them.  Because there wasn't - it was all music, all valuable, all good, all part of our terrestrial heritage.

Extrapolate to science and I think you can see into his mind: all the sciences are the same - they are science.  Though his specialty was astronomy, he could present all the sciences equally because they are equal.

From Cosmos, I also learned a bit about skepticism, something delved into more deeply in his seminal work, The Demon-Haunted World: Science as a Candle in the Dark.

But I also learned about Hope.

While "hope" remains a major part of Jesus's message, in modern times it is tainted by a fear - fear of the unknown, fear of punishment (just and unjust), fear of that which is different.  The church as we know it can harbor a deep darkness, one which even in my own church I am no longer comfortable being around - the irony of the church claiming to be a source of light and illumination is not lost on me.  I was fortunate that my mother also would not tolerate the darkness in some mens' and some churches' hearts, and picked the church we grew up in because of the light they chose to bring rather than the darkness many choose to present as the reality of the world that they claim to be the savior to.  Such churches breed the pain we have today and I refuse to believe that is the message that Jesus wanted to present by any stretch of the imagination.

My childhood church didn't live in fear - it lived in hope.

Carl Sagan's Cosmos also lived in hope.  Everything is possible in science.  Not only possible, it's likely, as long as you don't give up, fall back on the "known", and just say "it can't be solved" or "God did it, that's enough for me."  Such thoughts are a dead end for knowledge, and as such a dead end for our species and our future.  Yes, every answer raises more questions, just as every missing link found opens two more gaps in the fossil record.  Such is the ultimate in hope: that there is always something more to know, so we can always keep learning.  In this, there is always hope for the future.

Where "Star Trek" is the embodiment for some of what we hope to be, Cosmos presents Carl's hope of how to get there

Or to get wherever else we hope to be.

For Carl, keep hoping.
