Tragic events like yesterday's school shooting bring out the best in people. Teachers and pupils have been praised for timely actions that saved many dozens of lives from Adam Lanza's rampage. But, unfortunately, tragedy can also bring out a brazen self-righteousness in those who believe they have a stake in the fallout. I wrote yesterday about how the smoke that normally clouds America's gun control debates gets thicker in the immediate aftermath of yet another entirely preventable mass shooting. And sadly, so it has proven.
Scant hours after news of the shooting broke, Bryan Fischer, a talk radio host with the Christian American Family Association, was quick to score some theology points with his Bible-thumping listeners. He suggested secularism was to blame. You see, if there were classroom prayers, God would have intervened to prevent the shootings. Apparently, there were no school shootings when Fischer was a boy because schoolchildren honoured God with their daily prayer. Okay then. Absolutely nothing to do with the seismic shift toward a more individuated, atomised and violent culture at all.
Of course, with heavenly protection on your side, Fischer concludes nothing has to be done about weapon proliferation, and so the Second Amendment should remain. Mike Huckabee, former Arkansas governor, Republican presidential hopeful and Fox News contributor (a political health warning label if ever there was one - pictured) argued exactly the same point. He said, "We ask why there is violence in our schools, but we have systematically removed God from our schools. Should we be so surprised that schools would become a place of carnage?" (source) I think the ready availability of guns has more to do with it, Mike.
You've got to appreciate the Christian right's famous compassionate conservatism.
But in the conservative arms race to offer the most inhumane and counter-intuitive responses, Ann Coulter has taken an early lead. Her "solution" isn't intensive sermonising or gun control. No, what America's schools need is a proliferation of guns. Coulter argues that if more people were allowed to carry concealed weaponry, and if guns could be carried onto school premises, mass shootings would cease. Apparently, it's the defencelessness of schools that makes them such inviting targets for killers like Adam Lanza. Nothing to do with the social-psychological complexes of real and imagined grievances such men have, nor the ease with which the unstable and the dangerous can lay hold of automatic weaponry. So, in all seriousness, Coulter is saying she would rather American classrooms witness shootouts between teachers and gunmen than have the inalienable right to own assault rifles challenged.
So much for the misanthropic extremism that passes for mainstream conservatism in the United States. Over here the usual idiots have so far kept quiet. Nazi Nick Griffin is away addressing fascists in Catalonia, apparently. There has been some back and forth on Twitter over whether the EDL did or didn't tweet this, and it will be Thursday before the Weekly Worker publishes the inevitable riposte against gun control on the grounds of its workers' militia shibboleth. It's just as well precious few people take the extremes of the right and the left seriously.
None of these is what any reasonable person could count as a serious contribution to the gun control debate. It's trolling, for the sole purpose of preventing a frank, evidence-based argument from taking place.
But of course, it falls to the Daily Mail to shame us with its appalling journalistic practices, as its coverage of the shootings this morning made plain.
The Mail doorstepped the grieving parents of one of yesterday's victims. Where would we be without our free press?
(H/Ts to @EDLNewsXtra, @johnthelutheran)
Saturday, 15 December 2012
Trolling the Connecticut Shootings
Friday, 14 December 2012
Connecticut School Shooting
It's heartbreaking. At least 27 people, including 18 children between the ages of five and ten, have fallen victim, yet again, to a lone gunman. Reportedly, the young survivors were told to keep their eyes shut as they were led from Sandy Hook Elementary School (pictured), in the quintessential suburban American community of Newtown, Connecticut. You do not want to imagine the grim scenes visited upon the corridors and classrooms the surviving children had to escape from. I'm sure everyone, everywhere, with a decent, compassionate bone in their body stands in solidarity and sympathy with the victims, parents, staff and anyone connected to the tragedy.
But it's not a tragedy that was inevitable. Radical filmmaker Michael Moore has suggested at least 31 school shootings have taken place since the awful watershed moment of the Columbine High School shooting in 1999. According to this report by the US children's charity, the Children's Defense Fund, 5,740 kids were killed by guns in 2008 and 2009. Sadly, there's no reason to believe this quiet, unreported and unnecessary waste of life has lessened over time.
The White House was quick to suggest that "now" is not the time to begin discussing gun control. But when is there a more appropriate time? This isn't an "academic", sanitised debate like many political questions. As has been demonstrated with depressing regularity, gun control in America is a matter of life and death. The longer the prevarication, and the longer Second Amendment fundamentalists like the US National Rifle Association frustrate and obstruct a clear public reckoning with the issues, the greater the number of young lives cruelly cut short. And the obfuscation doesn't lessen when a tragedy like this occurs. If anything, the smokescreen gets thicker.
The gunman, named in local media as 20-year-old Adam Lanza, lived nearby. He apparently murdered his father at home before travelling to Sandy Hook, where his mother worked as a teacher, and then went on his rampage. His life and motivations are sure to be unravelled over the coming days, but it is very likely the familiar pattern of why men (particularly) murder in this way will be established. It must be remembered that appalling crimes like this are not unfathomable or inexplicable. They have triggers and causes that have long been established by psychologists and sociologists of crime.
Gun control isn't a magic catch-all that can prevent future Sandy Hooks. But if our collective wisdom readily understands how and why individuals like Lanza commit mass murder, it's not beyond our collective capacity as societies to address the endemic violence that suffuses our culture, nor to provide free-at-the-point-of-need mental health services. A holistic preventative strategy would require a significant cultural and policy shift in America. It's difficult, and much easier said than done, but the memory of the victims deserves nothing less.
Thursday, 13 December 2012
Majority Against Benefit Cuts Graph

I am very pleased to see the Ipsos-MORI poll (H/T LabourList) has turned political common sense on its head. Far from EdM being estranged from the public's thinking, it turns out our gilded chancellor is the one who is out of touch.
This is very welcome news. It underlines the stand Labour has made against Gideon's real-terms cuts to benefits and, once again, enhances EdM's reputation as a shrewd operator.
If there is no electoral mileage in the Tories bashing the poor, who will they turn on instead?
Labels:
Conservatives,
Labour,
Politics,
Social Security
On 'Post-Democracy'
Here's an old book review I wrote on Colin Crouch's Post-Democracy. Crouch's small book was published in 2004, and this review appeared a year later. Where political debate is concerned, many (if not all) of the problems addressed by Crouch's rather pessimistic argument still exercise the political imagination eight years on. Do I agree now with what I wrote then? Yes - though I would definitely not try to hitch this piece to the then already-dated anti-capitalist bandwagon. Consistent democracy is necessarily socialist. A politics that wants to reverse the tendency toward post-democracy is also one that cannot help but recast social relationships and challenge the supremacy of capital.
At a time of falling participation in elections and a widespread scepticism toward parliamentary politics, Colin Crouch’s new book is a timely diagnosis of the key problems facing advanced liberal democracies.
As the title suggests, Crouch’s thesis is that Western democracies are approaching a condition of ‘post-democracy’. Evoking the metaphor of the parabola, Crouch argues the democratic forms established at the high point of liberal democracy (located in the immediate post-war period, where rates of political participation were high and Keynesian interventionism had temporarily secured what Crouch terms the ‘democratic economy’) continue to persist, but their existence is constantly pressured by processes that seek to empty them of democratic content. This means we are not seeing a return to the pre-democratic past, but rather the negation of its negation: a new synthesis combining elements of the 19th century elitist past with the ‘democratic moment’ of post-war democracy.
The chief characteristic of this post-democratic epoch is a stress on “electoral participation as the main type of mass [political] participation, extensive freedom for lobbying activities, which mainly means business lobbies, and a form of polity that avoids interfering with a capitalist economy.” Crouch suggests this “is a model that has little interest in widespread citizen involvement or the role of organizations outside the business sector” (p.3). He is careful to emphasise that post-democracy and the democratic moment are ideal-typical constructs useful for measuring the health of democracy. The subsequent political-economic analysis in which these tools are deployed lays bare a number of worrying trends.
Taking the developmental patterns of the contemporary capitalist firm as his starting point, Crouch notes that the global firm is increasingly subject to rapid turnovers of corporate identity owing to the speed with which takeovers, buyouts, and reorganisations occur. Coupled with an unceasing appetite for casualised labour, companies have found it increasingly difficult to inculcate a sense of company identity as they seek to meet the imperatives of a fluid global economy concerned with maximising shareholder value. For Crouch, the ‘phantom firm’ sits at the apex of these developments. Drawing heavily on Klein’s (2000) critique, he argues these corporations view their core business as a rigidity that can be sub-contracted to various entities around the globe. This allows the business to shrink its infrastructure to service the financial and strategic decision-making necessary for concentrating on the new core: branding.
This model of the flexible firm has been adopted by political elites as the path to good government. To illustrate, Western governments can and do divest themselves of the “core business” of running public services by sub-contracting these operations to the private sector. The problem here is that governments can lose the competence to run these services themselves, with a consequent increased reliance on private consultants. By its own actions, “the despised institution of government is tending to resolve itself into three parts: a number of activities which it tries increasingly to convert into market form; a dreary residual and burdensome set of obligations which the private sector will not take off its hands; and an image-creating purely political component” (p.42). These processes do not bode well for the health of democracy: they generate the conditions for extremely close relations between government and business elites, where the latter enjoy almost unlimited access to the centre of political decision-making, and with it undue policy influence.
Recent changes in the class structure have fed into post-democratic developments. Crouch performs the contentious move of identifying the manual working class as the working class per se by opposing it to the “middle mass” of “diverse and heterogeneous groups of professionals, administrators, clerical and sales workers, employees of financial institutions, of public bureaucracies, and of welfare state organizations” (p.57). Nevertheless, by considering these non-manual strata as a differentiated bloc, Crouch is able to bring out the important divisions: the horizontal division into public and private sectors, and the vertical split between senior managers and professionals who are structurally closer to capital, and routine workers likely to experience proletarian conditions of employment. The identification of the upper tier of this stratum with capital aside, the political amorphousness of its subordinate levels has allowed politicians to read “lower middle class” concerns as identical to business interests, while its members are addressed by politics as individuals, “encouraged to seek no means of improvement other than for themselves” (p.60).
This group represents a difficult problem for democrats – it is most susceptible to manipulation owing to the absence of an autonomous political profile, and tends to be the most politically passive of groups, whilst being the most rapidly expanding sector in Western societies. Mainstream parties, however, have failed to acknowledge, let alone address, the real conditions of its everyday existence. For example, contemporary politics has shown a remarkable inability to grasp how downsizing, growing workloads, and mounting pressures on free time are common experiences of the workplace, meaning that discontent around these issues can escape ‘official’ frames. Such grievances might lead to a progressive politics animated by the “objective political agenda”, or find a populist right-wing expression - as the emergence of formations like the Lijst Pim Fortuyn and the United Kingdom Independence Party suggests.
This decoupling of politicians from their constituents is yet another outcome of the former's growing dependence on business elites. Drawing on models from political science in which a party leadership is surrounded by concentric circles of parliamentary representatives, activists, members, supporters, and the electorate, Crouch describes how this has given way to the ellipse-shaped leadership of the post-democratic party. This new core is composed of “conventional” party leaders and activists, an element motivated by money more than political commitment, and pure professionals drawn from outside the party’s ranks. Consequently, the big money and media connections the ellipse provides help partially overcome the previous reliance on activists (who tend to be rooted in a party’s constituency) for financial and electoral support, pointing toward a situation where political elites, with their intimate ties to corporate interests, are almost entirely self-reproducing. This is a situation where the lobbyist has pre-eminence over the citizen.
Simultaneously an outcome of and an aggravating factor in post-democratic tendencies is the encroachment of capital into areas previously deemed essential to the quality of social citizenship, and therefore too important to be subject to commercial imperatives. Viewed in the context of capital’s insatiable appetite for markets, politicians may believe commercialisation promises better quality public services, but for Crouch the existence of not-for-profit welfare systems has acted as a significant brake on the development of service provision as profitable markets. The role of the WTO and the IMF in the global dismantling of welfare states illustrates there is more behind privatisation than concerns for efficiency and consumer sovereignty.
What is to be done? Not a lot, it seems. Crouch endorses a number of schemes designed to reactivate the citizenry, such as a democratised form of state funding for political parties and the use of temporary citizens’ assemblies. He also suggests that “egalitarians” should stay alert for the emergence of new movements with democratising potentialities; make use of the post-democratic avenue of lobbying; continue to work “critically and conditionally” through parties; and seek ways to regulate the role of money in public life. Somewhat pessimistically, Crouch concludes that these measures can only mitigate post-democratic processes: a major reversal is impossible. Anti-capitalist projects are unfeasible because “no-one has yet found an effective alternative to the capitalist firm for process and product innovation and for customer responsiveness where most goods and services are concerned” (p.105).
This is frustrating because it allows Crouch to turn away from the logical outcome of his arguments: a consistently democratic challenge to post-democracy. He notes that “the tension between the egalitarian demands of democracy and the inequalities that result from capitalism can never be resolved …” (p.79), so why not overcome the split between politics and economics characteristic of capitalism (Wood 1995) by advocating a socialist project that subordinates more and more economic activity to democratic participation and regulation? Such a politics could potentially mobilise significant support by opposing privatisation with a democratic understanding of public ownership, whilst being rooted in the everyday “objective political agenda” post-democratic politics cannot address. In short, and almost in spite of himself, Crouch shows how the struggle against post-democracy must necessarily be anti-capitalist if substantive democracy is to be the defining feature of 21st century politics.
Wednesday, 12 December 2012
Oblivion, or, The Ruins of Future Earth
Oblivion is the next Tom Cruise vehicle. It looks rather nice.
Tom Cruise is a silly man with even sillier beliefs. But he has appeared in some intelligent and relatively plausible science fiction flicks over the years, and this must count as his first foray into the post-apocalypse (War of the Worlds is firmly in 'apocalypse now' territory).
Oblivion is chock full of the sublime aesthetic of reclaimed, deserted cityscapes; and the ergonomically terrific death-dealing drones are suitably Apple-esque. It's the Futurist Manifesto on screen - a cinematic invitation to gawp at the beauty of war.
The plot, such as it is, appears a variation on the Morlock/Eloi theme with aliens and robotic hunter/killers thrown into the mix. Entirely coincidentally I'm sure, it could easily lend itself to a first-person shooter video game adaptation too.
It's due out next April. I think this blog will be going to the cinema then.
Monday, 10 December 2012
Why Labour Must Oppose Benefit Cuts
Controversial. Ed Miliband has indicated he will take this LibDem-supported Tory government on over real-terms cuts to benefits. It's a risky move, but a necessary one for the long-term future of Labour politics.
It's got some folks in a flap. Paid-for right-wing dissenter Dan Hodges says the move typifies EdM's "political immaturity". An immaturity, it has to be said, that has seen consistent healthy poll leads and three recent difficult by-elections in which Labour increased its total vote share. None too shabby. While it's easy to write off Hodges' concerns (let's face it, he'd be painting David as a wild-eyed leftist had the other Miliband won), his argument that the route to power isn't through increased welfare spending is received wisdom shared by many Labour people, not just Blairite Ultras.
It's not difficult to understand why. Apart from the non-stop media onslaught against skivers and scroungers living the high life on unemployment and disability benefits, the very idea that someone is skipping work thanks to your taxes is a powerful one. As I said a little while ago, "it's the very idea that someone, somewhere is living off the backs of your labour without making a useful contribution themselves; that someone is living a profligate, responsibility-free life while you work to make ends meet and watch every penny you spend. And it's the idea someone is getting away with it while you're being taken for a mug." Welfare is out. Benefit-bashing is in.
As Hodges observes with a sneer, "Miliband is going to try to argue that his stance on welfare is somehow motivated by a desire to stand with “the strivers”. But at the same time as he’s opposing a freeze on welfare, he’s actually advocating a freeze on public sector pay. Is this seriously going to be Miliband’s offer in 2015?" A tricky position to be sure, and one eased by abandoning the current position on the pay freeze. Politically and economically it makes sense to u-turn on this one - the present crisis requires money be put in people's pockets to ameliorate the spiralling cost of living and give personal consumption a much-needed boost.
But back to the matter at hand. Whether EdM extricates himself from that sticky patch remains to be seen, but opposing welfare cuts is still the right thing to do and isn't necessarily the political own goal Hodges thinks it is. Leaving aside the moral obscenity of denying money to society's very poorest (especially in the context of income and corporation tax cuts for the rich), and the economic case against doing so, EdM's opposition to benefit cuts could represent a breach in the prevailing political order.
When it comes to welfare, Labour can be beastly to benefit recipients - up to a point. We've done a pretty good job of making the lives of the poorest pretty rough too. Ours is the party of the hated Work Capability Assessment and Atos-run assessment centres, though the Conservatives have made these hideous policies very much their own. But ultimately, the Tories and their UKIP mini-me can always undercut Labour in the race to the gutter.
If the right are in disarray, Labour can reap short-term electoral benefits from attacking welfare. And this is how it operated in the Blair years. But when the Conservatives are in power they will always outgun Labour in a sanctions arms race. You can either accept that state of affairs - as the Dan Hodgeses of this world would have us do (though for strictly pragmatic reasons, of course) - or try to change the terms of the debate. EdM's opposition could fall in the latter category.
In all likelihood, his attacks on the Tories will focus on benefits for those in work. The scandal of working people having to top up their wages with tax credits because their pay is too low is a relatively easy way of turning the debate against the Tories. Likewise with housing benefit: Labour's groping toward rent controls is a way of focusing debate on the real reason for the growing benefit bill - the unregulated private rental market.
Modest steps, but important ones. They allow us to call these benefits what they really are - subsidies for low wages and buy-to-let. In the cold light of day, who can object to these tax subsidies being countered by state and social movement activism around living wages and market regulation? The Tories would be all at sea in a welfare debate argued in these terms.
But these are, if you like, the easier benefits to argue over. Countering scrounger rhetoric on unemployment and disability is tougher, even though people are forced to depend on these benefits thanks to market failure more than for any other reason. But the game of politics is about picking and choosing your battles, and so far EdM has proven himself a rather astute player. Hence, instead of victimisation, the two Eds often talk about reducing the benefit bill by fixing the economy. This line is underdeveloped so far, but again it is one the government are vulnerable on.
Ultimately, the benefits battle is one Labour cannot duck. The well of resentment around this issue is absolutely toxic. We have to begin the clean-up by addressing the real causes and assuaging concerns, not pandering to them. Just as the most compelling argument against violent revolution is the observation that you can't build socialism out of a pile of bones, so a fairer, more tolerant social democratic society cannot arise from a swamp of resentment, bitterness and beggar-my-neighbour. If politics is to change for the better, if this century is to be the Labour Century, fear and spite must be challenged in the strongholds 30 years of neoliberal consensus has built. And it would appear this is something EdM understands.
Labels:
Class,
Disability,
Labour,
Politics,
Social Security
Sunday, 9 December 2012
The Difficulty Writing About Video Games
I'm outing myself as a video gaming pseud. I played them feverishly as a kid, completely lost interest in my 20s, and now only occasionally plunge into the odd retro game. My taste is mainly in the games of the late 80s and early 90s. I don't get on with the sweeping, cinematic masterpieces that dominate modern gaming, nor am I keen on so-called casual games, which appear to me as so many latter day retreads of proper old school 8 bit computer titles. That's me, those are my tastes.
Despite having almost zero interest in the PS3, the 360 and the Wii (and Wii U), I'm not so wedded to the old school that I fail to realise gaming now has a popularity magnitudes greater than the reach the C64 and Speccy once had. Some credit/blame the original PlayStation for properly taking games to the masses. As a Sega fanboy, I think Sonic making the Mega Drive cool did the necessary spadework in the UK. But whatever. Games have been an utterly mainstream pursuit for about 20 years. So I was amazed when, a decade ago, I found next to nothing in the way of cultural studies critiques of video games for a deconstructive assignment I was writing (bloggified version here); and I was even more gobsmacked to hear recently that little has changed.
Writing in the New Statesman, (Why are we still so bad at talking about video games?), Helen Lewis asks "whether the lack of a serious cultural conversation about games is holding back innovation" and "will we ever move beyond previews and reviews?" As you might expect, it prompted a bit of a response - this for example from Brendan Keogh points to where more sophisticated pieces can be found, which includes Critical Distance - a blog that regularly features what's interesting in the world of video game blogging.
As well as these, there are a few recommendations of my own. Something would be amiss if YouTube commentary didn't get a shout, so I heartily recommend the remarkable Chrontendo project - one man's quest to play and provide commentary on every single NES, Master System, PC Engine, Mega Drive and Super Nintendo game. He's currently up to summer 1989. There's also of course the wonderful Retro Gamer magazine that always has plenty to say on the classics of yesteryear, and the odd chin-strokey piece in Edge. I swear I saw an article in there the other month about Americana and the uses of the mobile phone in GTA IV.
All this, however, happens at the margins. Helen's basic point is that mainstream video game writing is strictly evaluative and, as such, is a function of the industry's political economy. I agree. The preview culture of video game magazines and websites has long been the means by which companies generate interest and, hopefully, build a head of hype around their product. By having trusted professional game reviewers confirm the spin and excitement through the award of a high score, companies are gifted a powerful marketing tool. Readers of a certain age may recall occasional Codemasters titles having real (and imagined) accolades plastered over their cassettes, for example.
On the consumption side of things, the mass market wants to know what's hot and what's rot. Once video games started getting really pricey in the early 90s, consumers - at least those who weren't casual or inexperienced gamers likely to be seduced by a flashy licence or nice packaging - needed video game writing to be a buyer's guide to separate out the quality from the highly-priced dross. This remains true, even though the newest releases can usually be picked up second hand on the high street relatively reasonably some six months after their release. Paying out a tenner for a crap game makes you feel cheated in ways going to see an awful film or forking out for a tedious book does not.
And because there's a market for it, the standard approach to video game writing is where the money and interest's at. If in the week after its release I write a blog post on zombies and ideology in Naughty Dog's upcoming The Last of Us, I am sure that would generate far less traffic than a straight review. In other words, the total complex of video game buying and playing is materially weighted against critical writing.
This brings me to the first of a couple of additional (historical) reasons why video game writing is stuck in a rut. If you date the modern video game industry from the first popular home systems of the late 70s, then from that point until the establishment of today's three-dimensional gaming standard on computers and consoles in the mid-90s, the overwhelming majority of games were, well, simple. The first home console, the Magnavox Odyssey, offered variations on the Pong theme. The subsequent generation of consoles and computers were similarly underpowered - see this Atari VCS conversion of the technologically undemanding original Pac-Man, for instance. Compare this simplicity with the first modern novels, or the first films. We are talking about completely different audience experiences demanding different sets of engagement practices. Novels and films are (at least according to their own specific disciplines of literary and film criticism) supposed to convey a story. Apart from adventure games, RPGs, and early Japanese visual novels from this period, narrative is (or rather, was) ancillary to gameplay.
The hegemonic game form throughout the 8- and 16-bit computer and console era was based loosely on the arcade experience. Whether fighting, shooting, platforming, or puzzling, the vast majority of video games amounted to little more than picking up your controller and getting on with the action. The enduring appeal of that era's Super Mario Bros and Sonic the Hedgehog games lay in the marrying of accessible play to imaginative and compelling game design. This was reflective of the arcade gaming experience as a whole. To keep the 10, 20, and 50p coins rolling into the machine, a game (ideally) had to be visually appealing and offer short, challenging but compulsive gaming experiences. Hence games lent themselves to instrumental, evaluative forms of writing. But the digital architecture of the games resisted the sorts of textual criticism found elsewhere. The hegemonic video game type was fundamentally depthless, the postmodern cultural product par excellence. Take these two examples. Is there much more that can be said about the culture curled up inside their code?
That said, the old school games weren't necessarily a mindless experience. Sans elaborate plot lines and breathtaking cinematics, players' imaginations often stepped in and imposed entirely subjective and somewhat arbitrary narratives on their play experience. Mercs was probably my most-played Mega Drive game back in the day. It was a good blast in the Commando mould and I played it to death dozens of times. But on all of those occasions I fought the mission according to my own ridiculous storyline over and above the official narrative of rescuing the US President (and no, before you ask, I did not see myself as the gun-toting protagonist - despite sharing the blond hair and muscular physique). So the complex social-psychological work of interpretation long identified in readers and filmgoers by 'professional criticism' was at work here too; and given the interactive nature of gaming, you could probably make the argument that the interpretative meaning-making process was a level of complexity above reading and watching.
Unfortunately, the apparent depthlessness of these games and their surface resistance to critical writing arrested the development of a specific language of video game criticism. But it is more than just a matter of 'catching up'. Game criticism's underdevelopment meant a potential entry point into the academy could not be taken.
In the time between Atari's VCS and Sony's PlayStation, the humanities in HE enjoyed continued institutional expansion. Sociology was confronted with a brasher and self-consciously cutting-edge upstart: Cultural Studies. Born out of Birmingham's late and very much lamented Centre for Contemporary Cultural Studies, and variously associated with the work of Stuart Hall and Raymond Williams, by the end of the 80s Cultural Studies was the most thoroughly postmodernised of the social sciences. It was a crucial disciplinary interface between academia and activism where identity questions around gender, ethnicity and sexuality were concerned, and it helped make French poststructuralism intellectually de rigueur. By this time, Marxism (neo or otherwise) was out. Economic determination in the last instance was unceremoniously dethroned by the tyranny of the cultural text in the first instance.
I digress.
As Cultural Studies expanded, and particularly after the wave of post-1992 new universities, this was the ideal moment for video game criticism to establish itself as a niche within the academic landscape and gain the kind of institutional sacralisation enjoyed by literary and film criticism. But it didn't happen. It would be too much to talk of a missed opportunity, because that implies it could possibly have happened. I would say it didn't because video games were then very much a generational thing (though chances are younger academics going into Cultural Studies had a games machine, or at least had played games at some point). They were also more niche, though that was changing with the arrival of glamorous consoles from Japan. But more than anything else, as a cultural phenomenon they were marginal to Cultural Studies debates around postmodernism, consumption, identity formation and the like. To illustrate, The Cybercultures Reader, edited by David Bell and Barbara Kennedy (2000), brought together key contributions to (cyber)cultural theory in 768 pages. It contains many interesting pieces on theorising subjectivity and identity construction on the internet. It also has only seven index entries for games, six of which appear in a single article, and then only as an auxiliary to the main discussion. It is amazing to think a canonical collection on a topic of this nature did not carry one piece on gaming - this, after the games consoles of the preceding generation had achieved then-record combined global hardware sales of around 140 million (excluding handhelds). I don't blame the editors. They, after all, only put together a reader that reflected the key concerns of Cultural Studies at the time.
Sadly, Cultural Studies as a discipline now appears to lead a subterranean existence in UK university media studies and sociology departments. If video game criticism is to get an academic imprimatur, it will have to be by another route. But it won't happen until it becomes a strategic sector within one or more disciplinary fields of the social sciences and/or the arts (not that this is stopping some from trying).
Helen wraps up her piece by noting that "perhaps [the writing] revolution in games criticism will never happen". I suspect she's right, unless a material interest somewhere in the industry, academia, or journalism coheres around it. Unfortunately, it will be some time before Zelda or Metroid command a lead essay in the video game equivalent of The London Review of Books.
Despite having almost zero interest in the PS3, the 360, the Wii (and Wii U), I'm not so down with the old school gang to realise gaming has a popularity magnitudes greater than the reach the C64 and Speccy once had. Some credit/blame the original Playstation for properly taking games to the masses. As a Sega fan boy, I think Sonic making the Mega Drive cool did the necessary spadework in the UK. But whatever. They have been an utterly mainstream pursuit for about 20 years. So I was amazed when I found nothing on the cultural studies/critiques of video games a decade ago for a deconstructive assignment I was writing (bloggified version here); and was even more gobsmacked to recently hear that little has changed.
Writing in the New Statesman, (Why are we still so bad at talking about video games?), Helen Lewis asks "whether the lack of a serious cultural conversation about games is holding back innovation" and "will we ever move beyond previews and reviews?" As you might expect, it prompted a bit of a response - this for example from Brendan Keogh points to where more sophisticated pieces can be found, which includes Critical Distance - a blog that regularly features what's interesting in the world of video game blogging.
As well as these, there are a few recommendations of my own. Something would be amiss if YouTube commentary didn't get a shout, so I heartily recommend the remarkable Chrontendo project - one man's quest to play and provide commentary on every single NES, Master System, PC Engine, Mega Drive and Super Nintendo game. He's currently up to summer 1989. There's also of course the wonderful Retro Gamer magazine that always has plenty to say on the classics of yesteryear, and the odd chin-strokey piece in Edge. I swear I saw an article in there the other month about Americana and the uses of the mobile phone in GTA IV.
All this, however, happens at the margins. Helen's basic point is that mainstream video game writing is strictly evaluative and, as such, is a function of the industry's political economy. I agree. The preview culture of video game magazines and websites has long been the means by which companies generate interest and, hopefully, create a head of hype around their product. By having trusted professional game reviewers confirm the spin and excitement through the award of a high score, companies are gifted a powerful marketing tool. Readers of a certain age may recall occasional Codemaster titles having real (and imagined) accolades plastered over their cassettes, for example.
On the consumption side of things, the mass market wants to know what's hot and what's rot. Once video games started getting really pricey in the early 90s, consumers - at least those who weren't casual or inexperienced gamers likely to be seduced by a flashy licence or nice packaging - needed video game writing to be a buyer's guide to separate out the quality from the highly-priced dross. This remains true, even though the newest releases can usually be picked up second hand on the high street relatively reasonably some six months after their release. Paying out a tenner for a crap game makes you feel cheated in ways going to see an awful film or forking out for a tedious book does not.
And because there's a market for it, the standard approach to video game writing is where the money and interest's at. If in the week after its release I write a blog post on zombies and ideology in Naughty Dog's upcoming The Last of Us, I am sure that would generate far less traffic than a straight review. In other words, the total complex of video game buying and playing is materially weighted against critical writing.
This brings me to the first of a couple of additional (historical) reasons why video game writing is stuck in a rut. If you date the modern video game industry from the first popular home systems of the late 70s, then between then and the establishment of today's three-dimensional gaming standard on computers and consoles from the mid-90s onwards, the overwhelming majority of games were, well, simple. The first home console, the Magnavox Odyssey, offered variations on the ball-and-paddle theme Pong would make famous. The subsequent generation of consoles and computers were similarly underpowered - see this Atari VCS conversion of the technologically undemanding original Pac-Man, for instance. Compare this simplicity with the first modern novels, or the first films. We are talking about completely different audience experiences demanding different sets of engagement practices. Novels and films are (at least according to their own specific disciplines of literary and film criticism) supposed to convey a story. Apart from adventure games, RPGs, and early Japanese visual novels from this period, narrative is (or rather, was) ancillary to gameplay.
The hegemonic game form throughout the 8- and 16-bit computer and console era was based loosely around the arcade experience. Whether fighting, shooting, platforming, or puzzling, the vast majority of video games amounted to little more than picking up your controller and getting on with the action. The enduring appeal of that era's Super Mario Bros and Sonic the Hedgehog games was the marrying of accessible play to imaginative and compelling game design. This was reflective of the arcade gaming experience as a whole. To keep the 10, 20, and 50p coins rolling into the machine, a game (ideally) had to be visually appealing and offer short, challenging but compulsive gaming experiences. Hence games lent themselves to instrumental, evaluative forms of writing, while their digital architecture resisted the sorts of textual criticism found elsewhere. The hegemonic video game type was fundamentally depthless, the postmodern cultural product par excellence. Take these two examples. Is there much more that can be said about the culture curled up inside their code?
That said, the old school games weren't necessarily a mindless experience. Sans elaborate plot lines and breathtaking cinematics, players' imaginations often stepped in and imposed entirely subjective and somewhat arbitrary narratives on the play experience. Mercs was probably my most-played Mega Drive game back in the day. It was a good blast in the Commando mould and I played it to death dozens of times. But on every one of those occasions I fought the mission according to my own ridiculous storyline over and above the official narrative of rescuing the US President (and no, before you ask, I did not see myself as the gun-toting protagonist - despite sharing the blond hair and muscular physique). So the complex social-psychological work of interpretation long identified in readers and filmgoers by 'professional criticism' was at work here too; and given the interactive nature of gaming, you could probably argue that this interpretative meaning-making process sits at a level of complexity above reading and watching.
Unfortunately, that apparent depthlessness and surface resistance to critical writing has arrested the development of a specific language of video game criticism. But it is more than just a matter of 'catching up': the underdevelopment of game criticism meant a potential entry point into the academy could not be taken.
In the time between Atari's VCS and Sony's Playstation, the humanities in HE enjoyed continued institutional expansion. Sociology found itself confronted by a brasher and self-consciously cutting-edge upstart: Cultural Studies. Born out of Birmingham's late and much-lamented Centre for Contemporary Cultural Studies, and variously associated with the work of Stuart Hall and Raymond Williams, by the end of the 80s Cultural Studies was the most thoroughly postmodernised of the social sciences. It was a crucial disciplinary interface between academia and activism where identity questions around gender, ethnicity and sexuality were concerned, and it helped make French poststructuralism the intellectual fashion of the season. By this time, Marxism (neo or otherwise) was out. Economic determination in the last instance was unceremoniously dethroned by the tyranny of the cultural text in the first instance.
I digress.
As Cultural Studies expanded, and particularly after the wave of post-1992 new universities, this was the ideal moment for video game criticism to establish itself as a niche within the academic landscape and acquire the kind of institutional sacralisation enjoyed by literary and film criticism. But it didn't happen. It would be too much to talk of a missed opportunity, because that implies it could ever have happened. I would say it didn't because video games were then very much a generational thing (though chances are younger academics going into Cultural Studies had owned a games machine, or at least had played games at some point). They were also more niche, though that was changing with the arrival of glamorous consoles from Japan. But more than anything else, as a cultural phenomenon they were marginal to Cultural Studies debates around postmodernism, consumption, identity formation and the like. To illustrate, The Cybercultures Reader edited by David Bell and Barbara Kennedy (2000) brought together key contributions to (cyber)cultural theory across 768 pages. It contains many interesting pieces on theorising subjectivity and identity construction on the internet. It also has only seven index entries for games, six of which appear in a single article, and then as an auxiliary to the main discussion. It is amazing to think a canonical collection on a topic of this nature did not carry a single piece on gaming - this, after the games consoles of the preceding generation had achieved then-record combined global hardware sales of around 140 million units (excluding handhelds). I don't blame the editors. They, after all, only put together a reader that reflected the key concerns of Cultural Studies at the time.
Sadly, Cultural Studies as a discipline now appears to lead a subterranean existence in UK university media studies and sociology departments. If video game criticism is to get an academic imprimatur, it will have to be by another route. But that won't happen until it becomes a strategic sector within one or more disciplinary fields of the social sciences and/or the arts (not that this is stopping some from trying).
Helen wraps up her piece by noting "perhaps [the writing] revolution in games criticism will never happen". I doubt it will happen too, unless a material interest somewhere in the industry, academia, or journalism coheres around it. Unfortunately, it will be some time before Zelda or Metroid command a lead essay in the video game equivalent of The London Review of Books.
Patrick Moore, 1923 - 2012
Very sorry to hear about Patrick Moore. He may have been an old xenophobe and UKIP botherer, but Patrick will be missed by millions of science fans. The Sky at Night will never be the same again. Nor xylophony.
Saturday, 8 December 2012
No More Tears (Enough is Enough)
It's time for an interlude. A camp interlude:
Friday, 7 December 2012
Show Some Respect
No sooner had it been announced that the nurse who took that prank radio call was found dead earlier today than the Twitterati started piling in with the glee of the ignorant and the self-righteous.
Quite possibly the worst contribution I've seen is the following (I've kept the author anonymous to protect him from his own stupidity). He wrote "I hope those Australian DJs have a guilty conscience for the rest of their lives. Disgusting. #RIP"
No doubt the DJs in question will, thanks to the efforts of dumb, callous comments like this. You've got to ask what's going on in someone's head that makes them want to use the death of another human being to lash out at others they have no connection to, people who are guilty of nothing more than pulling off a successful prank against the royal household.
If people really gave a shit about the poor woman who died, they should shut the fuck up. No one knows what was going on in her life, full stop. FFS, we don't even know if the woman is a victim of suicide. All we do know is someone has passed away under apparently tragic circumstances, and that the family and friends she leaves behind will be grieving. If anyone properly cares they will respect her memory and allow the people who knew her to come to terms with their loss. The rest can wait for the inquest.
Anything on top of that, the speculation, the axe-grinding, it's more than distasteful. It's downright sick.