Showing posts with label New York. Show all posts

Sunday, November 20, 2011

Google News: Pilot stuck in lavatory prompts terror scare

LaGuardia Airport, Queens, New York (image by dougtone via Flickr)
Google News
CNN - Nov 18, 2011
By A. Pawlowski, CNN The plane made an emergency landing at New York's LaGuardia Airport with the pilot at the controls. (CNN) -- A pilot stuck in the lavatory may sound like the opening line of a joke, but it triggered a terror scare on a flight from ...

Saturday, October 29, 2011

Google News: Lady Liberty turns 125

Google News
CNN - 2 hours ago
New York (CNN) -- As snow fell across New York Harbor, Isabel Belarsky clutched her mother, Clara, aboard a passenger ship that puttered toward Ellis Island and wondered what their new lives would bring.

Tuesday, October 4, 2011

New Wave

Iron Maiden live in Barcelona, 30 November 200... (image via Wikipedia)
See also: New Romantics and Synthpop


Deborah Harry from the band Blondie, performing at Maple Leaf Gardens in Toronto in 1977
Although punk rock was a significant social and musical phenomenon, it achieved less in the way of record sales (being distributed by small specialty labels such as Stiff Records)[154] or American radio airplay (as the radio scene continued to be dominated by mainstream formats such as disco and album-oriented rock).[155] Punk rock had attracted devotees from the art and collegiate world, and soon bands sporting a more literate, arty approach, such as Talking Heads and Devo, began to infiltrate the punk scene; in some quarters the description "New Wave" began to be used to differentiate these less overtly punk bands.[156] Record executives, who had been mostly mystified by the punk movement, recognized the potential of the more accessible New Wave acts and began aggressively signing and marketing any band that could claim a remote connection to punk or New Wave.[157] Many of these bands, such as The Cars and The Go-Go's, can be seen as pop bands marketed as New Wave;[158] other existing acts, including The Police, The Pretenders and Elvis Costello, used the New Wave movement as the springboard for relatively long and critically successful careers,[159] while "skinny tie" bands exemplified by The Knack,[160] or the photogenic Blondie, began as punk acts and moved into more commercial territory.[161]
Between 1982 and 1985, influenced by Kraftwerk, David Bowie, and Gary Numan, British New Wave went in the direction of such New Romantics as Spandau Ballet, Ultravox, Duran Duran, A Flock of Seagulls, Culture Club, Talk Talk and the Eurythmics, sometimes using the synthesizer to replace all other instruments.[162] This period coincided with the rise of MTV and led to a great deal of exposure for this brand of synthpop, creating what has been characterised as a second British Invasion.[163] Some more traditional rock bands adapted to the video age and profited from MTV's airplay, most obviously Dire Straits, whose "Money for Nothing" gently poked fun at the station, despite the fact that it had helped make them international stars,[164] but in general guitar-oriented rock was commercially eclipsed.[165]
Post-punk
Main article: Post-punk
See also: Gothic rock and Industrial music


U2 performing at Madison Square Garden in November 2005
If hardcore most directly pursued the stripped down aesthetic of punk, and New Wave came to represent its commercial wing, post-punk emerged in the later 1970s and early '80s as its more artistic and challenging side. Major influences beside punk bands were The Velvet Underground, The Who, Frank Zappa and Captain Beefheart, and the New York based no wave scene which placed an emphasis on performance, including bands such as James Chance and the Contortions, DNA and Sonic Youth.[166] Early contributors to the genre included the US bands Pere Ubu, Devo, The Residents and Talking Heads.[166]
The first wave of British post-punk included Gang of Four, Siouxsie and the Banshees and Joy Division, who placed less emphasis on art than their US counterparts and more on the dark emotional qualities of their music.[166] Bands like Siouxsie and the Banshees, Bauhaus, The Cure, and The Sisters of Mercy, moved increasingly in this direction to found Gothic rock, which had become the basis of a major sub-culture by the early 1980s.[167] Similar emotional territory was pursued by Australian acts like The Birthday Party and Nick Cave.[166] Members of Bauhaus and Joy Division explored new stylistic territory as Love and Rockets and New Order respectively.[166] Another early post-punk movement was the industrial music[168] developed by British bands Throbbing Gristle and Cabaret Voltaire, and New York-based Suicide, using a variety of electronic and sampling techniques that emulated the sound of industrial production and which would develop into a variety of forms of post-industrial music in the 1980s.[169]
The second generation of British post-punk bands that broke through in the early 1980s, including The Fall, The Pop Group, The Mekons, Echo and the Bunnymen and Teardrop Explodes, tended to move away from dark sonic landscapes.[166] Arguably the most successful band to emerge from post-punk was Ireland's U2, who incorporated elements of religious imagery together with political commentary into their often anthemic music, and by the late 1980s had become one of the biggest bands in the world.[170] Although many post-punk bands continued to record and perform, it declined as a movement in the mid-1980s as acts disbanded or moved off to explore other musical areas, but it has continued to influence the development of rock music and has been seen as a major element in the creation of the alternative rock movement.[171]
New waves and genres in heavy metal
Main article: Heavy metal music
See also: NWOBHM, Glam metal, and Extreme metal


Iron Maiden, one of the central bands in the New Wave of British Heavy Metal, performing in Barcelona in 2006
Although many established bands continued to perform and record, heavy metal suffered a hiatus in the face of the punk movement in the mid-1970s. Part of the reaction saw the popularity of bands like Motörhead, who had adopted a punk sensibility, and Judas Priest, who created a stripped-down sound, largely removing the remaining elements of blues music, beginning with their 1978 album Stained Class.[172] This change of direction was compared to punk and in the late 1970s became known as the New Wave of British Heavy Metal (NWOBHM).[173] These bands were soon followed by acts including Iron Maiden, Vardis, Diamond Head, Saxon, Def Leppard and Venom, many of which began to enjoy considerable success in the USA.[174] In the same period Eddie Van Halen established himself as a metal guitar virtuoso after his band's self-titled 1978 album.[175] Randy Rhoads and Yngwie Malmsteen also became established virtuosos, associated with what would be known as the neoclassical metal style.[176]
Inspired by NWOBHM and Van Halen's success, a metal scene began to develop in Southern California from the late 1970s, based on the clubs of L.A.'s Sunset Strip and including such bands as Quiet Riot, Ratt, Mötley Crüe, and W.A.S.P., who, along with similarly styled acts such as New York's Twisted Sister, incorporated the theatrics (and sometimes makeup) of glam rock acts like Alice Cooper and Kiss.[175] The lyrics of these glam metal bands characteristically emphasized hedonism and wild behavior and musically were distinguished by rapid-fire shred guitar solos, anthemic choruses, and a relatively melodic, pop-oriented approach.[175] By the mid-1980s bands were beginning to emerge from the L.A. scene that pursued a less glam image and a rawer sound, particularly Guns N' Roses, breaking through with the chart-topping Appetite for Destruction (1987), and Jane's Addiction, who emerged with their major label debut Nothing's Shocking, the following year.[177]
In the late 1980s metal fragmented into several subgenres, including thrash metal, which developed in the US from the style known as speed metal, under the influence of hardcore punk, with low-register guitar riffs typically overlaid by shredding leads.[178] Lyrics often expressed nihilistic views or dealt with social issues using visceral, gory language. It was popularised by the "Big Four of Thrash": Metallica, Anthrax, Megadeth, and Slayer.[174] Death metal developed out of thrash, particularly influenced by the bands Venom and Slayer. Florida's Death and the Bay Area's Possessed emphasized lyrical elements of blasphemy, diabolism and millenarianism, with vocals usually delivered as guttural "death growls" or high-pitched screaming, complemented by downtuned, highly distorted guitars and extremely fast double bass percussion.[179] Black metal, again influenced by Venom and pioneered by Denmark's Mercyful Fate, Switzerland's Hellhammer and Celtic Frost, and Sweden's Bathory, had many similarities in sound to death metal, but was often intentionally lo-fi in production and placed greater emphasis on satanic and pagan themes.[180][181] Bathory were particularly important in inspiring the further sub-genres of Viking metal and folk metal.[182] Power metal emerged in Europe in the late 1980s as a reaction to the harshness of death and black metal and was established by Germany's Helloween, who combined a melodic approach with thrash's speed and energy.[183] England's DragonForce[184] and Florida's Iced Earth[185] have a sound indebted to NWOBHM, while acts such as Florida's Kamelot, Finland's Nightwish, Italy's Rhapsody of Fire, and Russia's Catharsis feature a keyboard-based "symphonic" sound, sometimes employing orchestras and opera singers.
In contrast to other sub-genres doom metal, influenced by Gothic rock, slowed down the music, with bands like England's Pagan Altar and Witchfinder General and the United States' Pentagram, Saint Vitus and Trouble, emphasizing melody, down-tuned guitars, a 'thicker' or 'heavier' sound and a sepulchral mood.[186][187]
Heartland rock
Main article: Heartland rock


Bruce Springsteen in East Berlin in 1988
American working-class oriented heartland rock, characterized by a straightforward musical style and a concern with the lives of ordinary, blue-collar American people, developed in the second half of the 1970s. The term heartland rock was first used to describe Midwestern arena rock groups like Kansas, REO Speedwagon and Styx, but came to be associated with a more socially concerned form of roots rock more directly influenced by folk, country and rock and roll.[188] It has been seen as an American Midwest and Rust Belt counterpart to West Coast country rock and the Southern rock of the American South.[189] Led by figures who had initially been identified with punk and New Wave, it was most strongly influenced by acts such as Bob Dylan, The Byrds, Creedence Clearwater Revival and Van Morrison, and the basic rock of '60s garage and the Rolling Stones.[190]
Exemplified by the commercial success of singer songwriters Bruce Springsteen, Bob Seger, and Tom Petty, along with less widely known acts such as Southside Johnny and the Asbury Jukes and Joe Grushecky and the Houserockers, it was partly a reaction to post-industrial urban decline in the East and Mid-West, often dwelling on issues of social disintegration and isolation, beside a form of good-time rock and roll revivalism.[190] The genre reached its commercial, artistic and influential peak in the mid-1980s, with Springsteen's Born in the USA (1984), topping the charts worldwide and spawning a series of top ten singles, together with the arrival of artists including John Mellencamp, Steve Earle and more gentle singer/songwriters such as Bruce Hornsby.[190] It can also be heard as an influence on artists as diverse as Billy Joel,[191] Kid Rock[192] and The Killers.[193]
Heartland rock faded away as a recognized genre by the early 1990s, as rock music in general, and blue collar and white working class themes in particular, lost influence with younger audiences, and as heartland's artists turned to more personal works.[190] Many heartland rock artists continue to record today with critical and commercial success, most notably Bruce Springsteen, Tom Petty and John Mellencamp, although their works have become more personal and experimental and no longer fit easily into a single genre. Newer artists whose music would perhaps have been labelled heartland rock had it been released in the 1970s or 1980s, such as Missouri's Bottle Rockets and Illinois' Uncle Tupelo, often find themselves labeled alt-country.[194]
The emergence of alternative rock
Main article: Alternative rock
See also: Jangle pop, College rock, Indie pop, Dream pop, and Shoegaze


R.E.M. was a successful alternative rock band in the 1980s
The term alternative rock was coined in the early 1980s to describe rock artists who did not fit into the mainstream genres of the time. Bands dubbed "alternative" had no unified style, but were all seen as distinct from mainstream music. Alternative bands were linked by their collective debt to punk rock, through hardcore, New Wave or the post-punk movements.[195] Important alternative rock bands of the 1980s in the US included R.E.M., Hüsker Dü, Jane's Addiction, Sonic Youth, and the Pixies,[195] and in the UK The Cure, New Order, The Jesus and Mary Chain, and The Smiths.[196] Artists were largely confined to independent record labels, building an extensive underground music scene based on college radio, fanzines, touring, and word-of-mouth.[197] They rejected the dominant synthpop of the early 1980s, marking a return to group-based guitar rock.[198][199][200]
Few of these early bands, with the exceptions of R.E.M. and The Smiths, achieved mainstream success, but despite a lack of spectacular album sales, they exerted a considerable influence on the generation of musicians who came of age in the 1980s and ended up breaking through to mainstream success in the 1990s. Styles of alternative rock in the U.S. during the 1980s included jangle pop, associated with the early recordings of R.E.M., which incorporated the ringing guitars of mid-1960s pop and rock, and college rock, used to describe alternative bands that began in the college circuit and college radio, including acts such as 10,000 Maniacs and The Feelies.[195] In the UK Gothic rock was dominant in the early 1980s, but by the end of the decade it had given way to indie or dream pop[201] acts like Primal Scream, Bogshed, Half Man Half Biscuit and The Wedding Present, and what were dubbed shoegaze bands like My Bloody Valentine, Ride, Lush, Chapterhouse, and the Boo Radleys.[202] Particularly vibrant was the Madchester scene, which produced such bands as Happy Mondays, the Inspiral Carpets, and the Stone Roses.[196][203] The next decade would see the success of grunge in the United States and Britpop in the United Kingdom, bringing alternative rock into the mainstream.
Alternative goes mainstream (the 1990s)

Grunge
Main article: Grunge


Nirvana (pictured here in 1992) popularized grunge worldwide.
Disaffected by commercialized and highly produced pop and rock in the mid-1980s, bands in Washington state (particularly in the Seattle area) formed a new style of rock which sharply contrasted with the mainstream music of the time.[204] The developing genre came to be known as "grunge", a term descriptive of the dirty sound of the music and the unkempt appearance of most musicians, who actively rebelled against the over-groomed images of popular artists.[204] Grunge fused elements of hardcore punk and heavy metal into a single sound, and made heavy use of guitar distortion, fuzz and feedback.[204] The lyrics were typically apathetic and angst-filled, and often concerned themes such as social alienation and entrapment, although it was also known for its dark humor and parodies of commercial rock.[204]
Bands such as Green River, Soundgarden, the Melvins and Skin Yard pioneered the genre, with Mudhoney becoming the most successful by the end of the decade. However, grunge remained largely a local phenomenon until 1991, when Nirvana's Nevermind became a huge success thanks to the lead single "Smells Like Teen Spirit".[205] Nevermind was more melodic than its predecessors, but the band refused to employ traditional corporate promotion and marketing mechanisms. During 1991 and 1992, other grunge albums such as Pearl Jam's Ten, Soundgarden's Badmotorfinger and Alice in Chains' Dirt, along with the Temple of the Dog album featuring members of Pearl Jam and Soundgarden, ranked among the 100 top-selling albums.[206] The popular breakthrough of these grunge bands prompted Rolling Stone to nickname Seattle "the new Liverpool."[207] Major record labels signed most of the remaining grunge bands in Seattle, while a second influx of acts moved to the city in the hope of success.[208] However, with the death of Kurt Cobain and the subsequent break-up of Nirvana in 1994, touring problems for Pearl Jam and the departure of Alice in Chains' lead singer Layne Staley in 1996, the genre began to decline, partly to be overshadowed by Britpop and more commercial-sounding post-grunge.[209]
Britpop
Main article: Britpop


Oasis performing in 2005
Britpop emerged from the British alternative rock scene of the early 1990s and was characterised by bands particularly influenced by British guitar music of the 1960s and 1970s.[196] The Smiths were a major influence, as were bands of the Madchester scene, which had dissolved in the early 1990s.[51] The movement has been seen partly as a reaction against various U.S. based, musical and cultural trends in the late 1980s and early 1990s, particularly the grunge phenomenon and as a reassertion of a British rock identity.[196] Britpop was varied in style, but often used catchy tunes and hooks, beside lyrics with particularly British concerns and the adoption of the iconography of the 1960s British Invasion, including the symbols of British identity previously utilised by the mods.[210] It was launched around 1992 with releases by groups such as Suede and Blur, who were soon joined by others including Oasis, Pulp, Supergrass and Elastica, who produced a series of top ten albums and singles.[196] For a while the contest between Blur and Oasis was built by the popular press into "The Battle of Britpop", initially won by Blur, but with Oasis achieving greater long-term and international success, directly influencing a third generation of Britpop bands, including The Boo Radleys, Ocean Colour Scene and Cast.[211] Britpop groups brought British alternative rock into the mainstream and formed the backbone of a larger British cultural movement known as Cool Britannia.[212] Although its more popular bands, particularly Blur and Oasis, were able to spread their commercial success overseas, especially to the United States, the movement had largely fallen apart by the end of the decade.[196]
Post-grunge
Main article: Post-grunge


Foo Fighters performing an acoustic show in 2007
The term post-grunge was coined for the generation of bands that followed the emergence into the mainstream, and subsequent hiatus, of the Seattle grunge bands. Post-grunge bands emulated their attitudes and music, but with a more radio-friendly commercially oriented sound.[209] Often they worked through the major labels and came to incorporate diverse influences from jangle pop, pop-punk, alternative metal or hard rock.[209] The term post-grunge was meant to be pejorative, suggesting that they were simply musically derivative, or a cynical response to an "authentic" rock movement.[213] From 1994, former Nirvana drummer Dave Grohl's new band, the Foo Fighters, helped popularize the genre and define its parameters.[214]
Some post-grunge bands, like Candlebox, were from Seattle, but the sub-genre was marked by a broadening of the geographical base of grunge, with bands like Los Angeles' Audioslave and Georgia's Collective Soul, and, beyond the US, Australia's Silverchair and Britain's Bush, all of which cemented post-grunge as one of the most commercially viable sub-genres of the late 1990s.[195][209] Although male bands predominated, female solo artist Alanis Morissette's 1995 album Jagged Little Pill, labelled as post-grunge, also became a multi-platinum hit.[215] Bands like Creed and Nickelback took post-grunge into the 21st century with considerable commercial success, abandoning most of the angst and anger of the original movement for more conventional anthems, narratives and romantic songs, and were followed in this vein by new acts including Shinedown, Seether and 3 Doors Down.[213]
Pop punk
Main article: Pop punk


Green Day performing in 2009
The origins of 1990s pop punk can be seen in the more song-oriented bands of the 1970s punk movement like The Buzzcocks and The Clash, commercially successful New Wave acts such as The Jam and The Undertones, and the more hardcore-influenced elements of alternative rock in the 1980s.[216] Pop-punk tends to use power-pop melodies and chord changes with speedy punk tempos and loud guitars.[217] Punk music provided the inspiration for some California-based bands on independent labels in the early 1990s, including Rancid, Pennywise, Weezer and Green Day.[216] In 1994 Green Day moved to a major label and produced the album Dookie, which found a new, largely teenage, audience and proved a surprise diamond-selling success, leading to a series of hit singles, including two number ones in the US.[195] They were soon followed by the eponymous début from Weezer, which spawned three top ten singles in the US.[218] This success opened the door for the multi-platinum sales of metallic punk band The Offspring with Smash (1994).[195] This first wave of pop punk reached its commercial peak with Green Day's Nimrod (1997) and The Offspring's Americana (1998).[219]
A second wave of pop punk was spearheaded by Blink-182, with their breakthrough album Enema of the State (1999), followed by bands such as Good Charlotte, Bowling for Soup and Sum 41, who made use of humour in their videos and had a more radio-friendly tone to their music, while retaining the speed, some of the attitude and even the look of 1970s punk.[216] Later pop-punk bands, including Simple Plan, The All-American Rejects and Fall Out Boy, had a sound that has been described as closer to 1980s hardcore, while still achieving considerable commercial success.[216]
Indie rock
Main article: Indie rock
See also: Riot Grrrl, Lo-fi music, Post rock, Math rock, Space rock, Sadcore, and Baroque pop


Lo-fi indie rock band Pavement in 2006
In the 1980s the terms indie rock and alternative rock were used interchangeably.[220] By the mid-1990s, as elements of the movement began to attract mainstream interest, particularly grunge and then Britpop, post-grunge and pop-punk, the term alternative began to lose its meaning.[220] Those bands following the less commercial contours of the scene were increasingly referred to by the label indie.[220] They characteristically attempted to retain control of their careers by releasing albums on their own or small independent labels, while relying on touring, word-of-mouth, and airplay on independent or college radio stations for promotion.[220] Linked by an ethos more than a musical approach, the indie rock movement encompassed a wide range of styles, from hard-edged, grunge-influenced bands like The Cranberries and Superchunk, through do-it-yourself experimental bands like Pavement, to punk-folk singers such as Ani DiFranco.[195][196] It has been noted that indie rock has a relatively high proportion of female artists compared with preceding rock genres, a tendency exemplified by the development of feminist-informed Riot Grrrl music.[221] Many countries have developed extensive local indie scenes, flourishing with bands popular enough to survive inside their respective countries but virtually unknown outside them.[222]
By the end of the 1990s many recognisable sub-genres, most with their origins in the late '80s alternative movement, were included under the umbrella of indie. Lo-fi eschewed polished recording techniques for a D.I.Y. ethos and was spearheaded by Beck, Sebadoh and Pavement.[195] The work of Talk Talk and Slint helped inspire both post rock, an experimental style influenced by jazz and electronic music, pioneered by Bark Psychosis and taken up by acts such as Tortoise, Stereolab, and Laika,[223][224] as well as leading to more dense and complex, guitar-based math rock, developed by acts like Polvo and Chavez.[225] Space rock looked back to progressive roots, with drone-heavy and minimalist acts like Spacemen 3, the two bands created out of its split, Spectrum and Spiritualized, and later groups including Flying Saucer Attack, Godspeed You Black Emperor! and Quickspace.[226] In contrast, Sadcore emphasised pain and suffering through melodic use of acoustic and electronic instrumentation in the music of bands like American Music Club and Red House Painters,[227] while the revival of Baroque pop reacted against lo-fi and experimental music by placing an emphasis on melody and classical instrumentation, with artists like Arcade Fire, Belle and Sebastian and Rufus Wainwright.[228]
Alternative metal, rap rock and nu metal
Main article: Heavy metal music
See also: Alternative metal, Rap rock, Rap metal, and Nu metal


Linkin Park performing in 2009
Alternative metal emerged from the hardcore scene of alternative rock in the US in the later 1980s, but gained a wider audience after grunge broke into the mainstream in the early 1990s.[229] Early alternative metal bands mixed a wide variety of genres with hardcore and heavy metal sensibilities, with acts like Jane's Addiction and Primus utilizing prog-rock, Soundgarden and Corrosion of Conformity using garage punk, The Jesus Lizard and Helmet mixing noise-rock, Ministry and Nine Inch Nails influenced by industrial music, Monster Magnet moving into psychedelia, Pantera, Sepultura and White Zombie creating groove metal, while Biohazard and Faith No More turned to hip hop and rap.[229]
Hip hop had gained attention from rock acts in the early 1980s, including The Clash with "The Magnificent Seven" (1981) and Blondie with "Rapture" (1981).[230][231] Early crossover acts included Run DMC and the Beastie Boys.[232] Detroit rapper Esham became known for his "acid rap" style, which fused rapping with a sound that was often based in rock and heavy metal.[233][234] Rappers who sampled rock songs included Ice-T, The Fat Boys, LL Cool J, Public Enemy and Whodini.[235] The mixing of thrash metal and rap was pioneered by Anthrax on their 1987 comedy-influenced single "I'm the Man".[235]
In 1990, Faith No More broke into the mainstream with their single "Epic", often seen as the first truly successful combination of heavy metal with rap.[236] This paved the way for the success of existing bands like 24-7 Spyz and Living Colour, and new acts including Rage Against the Machine and Red Hot Chili Peppers, who all fused rock and hip hop among other influences.[213][235] Among the first wave of performers to gain mainstream success as rap rock were 311,[237] Bloodhound Gang,[238] and Kid Rock.[239] A more metallic sound - nu metal - was pursued by bands including Limp Bizkit, Korn and Slipknot.[235] Later in the decade this style, which contained a mix of grunge, punk, metal, rap and turntable scratching, spawned a wave of successful bands like Linkin Park, P.O.D. and Staind, who were often classified as rap metal or nu metal; the first of these, Linkin Park, are the best-selling band of the genre.[240]
In 2001, nu metal reached its peak with albums like Staind's Break the Cycle, P.O.D.'s Satellite, Slipknot's Iowa and Linkin Park's Hybrid Theory. New bands also emerged, like Disturbed, post-grunge/hard rock band Godsmack and Papa Roach, whose major label début Infest became a platinum hit.[241] However, by 2002 there were signs that nu metal's mainstream popularity was weakening.[213] Korn's long-awaited fifth album Untouchables, and Papa Roach's second album Lovehatetragedy, did not sell as well as their previous releases, while nu metal bands were played more infrequently on rock radio stations and MTV began focusing on pop punk and emo.[242] Since then, many bands have changed to a more conventional hard rock or heavy metal sound.[242]
Post-Britpop
Main article: Post-Britpop


Coldplay in 2008
From about 1997, as dissatisfaction grew with the concept of Cool Britannia, and Britpop as a movement began to dissolve, emerging bands began to avoid the Britpop label while still producing music derived from it.[243][244] Many of these bands tended to mix elements of British traditional rock (or British trad rock),[245] particularly the Beatles, Rolling Stones and Small Faces,[246] with American influences, including post-grunge.[247][248] Drawn from across the United Kingdom (with several important bands emerging from the north of England, Scotland, Wales and Northern Ireland), the themes of their music tended to be less parochially centred on British, English and London life and more introspective than had been the case with Britpop at its height.[249][250] This, beside a greater willingness to engage with the American press and fans, may have helped some of them in achieving international success.[251]
Post-Britpop bands have been seen as presenting the image of the rock star as an ordinary person, and their increasingly melodic music was criticised for being bland or derivative.[252] Post-Britpop bands like The Verve with Urban Hymns (1997), Radiohead from OK Computer (1997), Travis from The Man Who (1999), Stereophonics from Performance and Cocktails (1999), Feeder from Echo Park (2001) and particularly Coldplay from their debut album Parachutes (2000), achieved much wider international success than most of the Britpop groups that had preceded them, and were some of the most commercially successful acts of the late 1990s and early 2000s, arguably providing a launchpad for the subsequent garage rock or post-punk revival, which has also been seen as a reaction to their introspective brand of rock.


Monday, October 3, 2011

Extremely Loud And Incredibly Close

Air raids on Dresden (image via Wikipedia)
Extremely Loud and Incredibly Close is a 2005 novel by Jonathan Safran Foer. The book's narrator is a nine-year-old boy named Oskar Schell. Two years before the story begins, Oskar's father dies on 9/11. In the story, Oskar discovers a key in a vase that belonged to his father that inspires him to search all around New York for information about the key.
Narration

The main narrator of the story is nine-year-old Oskar Schell, an intellectually curious and sensitive child of Manhattan progressives. He is a pacifist, a vegan, musical (he plays the tambourine), academically inclined, and above all, earnest. Two additional narrators, Oskar's paternal grandparents, tell the story of their childhood, courtship, marriage, and separation before the birth of Oskar's father; much of their story is presented as a series of letters addressed to Oskar or his father.
Criticism

Critical response towards Extremely Loud and Incredibly Close has been generally less positive than for Foer's first novel; John Updike, writing for The New Yorker, found the second novel to be: "thinner, overextended, and sentimentally watery", stating that "the book’s hyperactive visual surface covers up a certain hollow monotony in its verbal drama".[1] In a New York Times review Michiko Kakutani said, "While it contains moments of shattering emotion and stunning virtuosity that attest to Mr. Foer's myriad gifts as a writer, the novel as a whole feels simultaneously contrived and improvisatory, schematic and haphazard."[2] Kakutani also stated the book was "cloying" and identified the unsympathetic main character as a major issue. Harry Siegel, writing in New York Press, bluntly titled his review of the book "Extremely Cloying & Incredibly False: Why the author of Everything Is Illuminated is a fraud and a hack", seeing Foer as an opportunist taking advantage of 9/11 "to make things important, to get paid" while also adding "The writers who make it get treated as symbols. Whitehead gets compared to Ellison, because they're both black; Lethem writes a book about race invisibility, but since he's a white boy, no one thinks to mention Ellison. In the same vein, Foer is supposed to be our new Philip Roth, though his fortune-cookie syllogisms and pointless illustrations and typographical tricks don't at all match up to or much resemble Roth even at his most inane. But Jews will be Jews, apparently."[3] Anis Shivani said similarly in a Huffington Post article entitled "The 15 Most Overrated Contemporary American Writers", claiming Foer "Rode the 9/11-novel gravy train with Extremely Loud and Incredibly Close, giving us a nine-year-old with the brain of a twenty-eight-year-old Jonathan Safran Foer".[4]
Film adaptation

Main article: Extremely Loud and Incredibly Close (film)
A film adaptation of the novel is in production as of April 2011. The script was written by Eric Roth, with Stephen Daldry directing.[5] Tom Hanks, Sandra Bullock, John Goodman, Viola Davis, and Jeffrey Wright are attached to star,[6] alongside 2010 Jeopardy! Kids Week winner Thomas Horn, 12, as Oskar Schell.[7] The film is being produced by Paramount Pictures and Warner Bros. and is set to be released in 2012.
Comparisons to The History of Love

Extremely Loud and Incredibly Close was published in early 2005, as was The History of Love, written by Nicole Krauss, who had just married Foer. Both books feature a precocious youth who sets out in New York City on a quest. Both protagonists encounter old men with memories of World War II (a Holocaust survivor in Krauss and a survivor of the Dresden firebombing in Foer). Both old men recently suffered the death of long-lost sons. The stories also use some similar and uncommon literary techniques, such as unconventional typography.

TaskRabbit

Who are the TaskRabbits?

TaskRabbits are friendly, awesome people in your community who are either under-employed, retired, parents with grown children, or folks who just want something more interesting than a standard desk job.



Sunday, September 25, 2011

Kieran Culkin

Kieran Culkin at the 2010 Comic Con in San Diego
Culkin was born in New York City, the son of Patricia Brentrup and Christopher 'Kit' Culkin, a former stage actor with a long career on Broadway.[1]
He has four brothers, Shane Arliss (b. 1976), Macaulay Carson Culkin (b. 1980), Christian Patrick (b. 1987), and Rory Hugh Culkin (b. 1989), and two sisters, Dakota Ulissa (1978–2008) and Quinn Kay (b. 1984).[2]
Career

Kieran Culkin's first film role was a small part alongside his brother, Macaulay, in Home Alone as cousin Fuller McCallister. He continued acting as a child and teenager, mainly working in comedies, including Home Alone 2: Lost in New York and Father of the Bride and its sequel.
As a teenager, he alternated between lead roles in independent films and small parts in mainstream films. He played the title role in the film Igby Goes Down, for which he was nominated for a Golden Globe Award. He appeared in the Academy Award-nominated movie Music of the Heart as well.
He played Buff in Eric Bogosian's updated version of SubUrbia at the Second Stage Theatre in New York. In 2010, Culkin played Scott Pilgrim's "cool gay roommate" Wallace Wells in the movie Scott Pilgrim vs. the World.
He also had the lead role in The Mighty as Kevin Dillon.
Filmography



Culkin at the 2008 Toronto International Film Festival
Year Title Role Notes
1990 Home Alone Fuller McCallister
1991 Only the Lonely Patrick Muldoon Jr.
1991 Father of the Bride Matty Banks Nominated — Young Artist Award for Best Young Actor Co-starring in a Motion Picture
1992 Home Alone 2: Lost in New York Fuller McCallister
1993 Nowhere to Run Mike 'Mookie' Anderson
1994 My Summer Story Ralph 'Ralphie' Parker
1995 Father of the Bride Part II Matty Banks
1996 Amanda Biddle Farnsworth
1998 The Mighty Kevin Dillon Nominated — Young Artist Award for Best Performance in a Feature Film - Leading Young Actor
1999 She's All That Simon Boggs
1999 Music of the Heart Lexi at 15
1999 The Cider House Rules Buster Nominated — Screen Actors Guild Award for Outstanding Performance by a Cast in a Motion Picture
1999 The Magical Legend of the Leprechauns Barney O'Grady
2001 Go Fish Andy 'Fish' Troutner Lead role, TV series (5 episodes)
2002 The Dangerous Lives of Altar Boys Tim Sullivan
2002 Igby Goes Down Jason "Igby" Slocumb, Jr. Broadcast Film Critics Association Award for Best Young Performer
Las Vegas Film Critics Society Award for Youth in Film
Satellite Award for Best Actor - Motion Picture Musical or Comedy
Nominated — Golden Globe Award for Best Actor – Motion Picture Musical or Comedy
Nominated — MTV Movie Award for Best Breakthrough Performance
2009 Lymelife Jimmy Bartlett
2009 Paper Man Christopher
2009 Three Stories About Joan
2010 Scott Pilgrim vs. the World Wallace Wells Nominated — Detroit Film Critics Society Award for Best Ensemble
2010 The Stanford Prison Experiment
2011 Margaret Paul
2012 The Other Side Rupert Pupkin
Stage credits

Year Title Role Notes
2000 The Moment When Wilson Playwrights Horizons, New York
2003 This is Our Youth Warren Garrick Theatre, London
2004 After Ashley Justin Hammond Vineyard Theatre, New York; Obie Award for Performance
2007 subUrbia Buff Second Stage Theatre, New York



Saturday, September 24, 2011

Trading Sessions

Now that you know what forex is, why you should trade it, and who makes up the forex market, it's about time you learned when you can trade.

Yes, it is true that the forex market is open 24 hours a day, but that doesn't mean it's always active the whole day.

You can make money trading when the market moves up, and you can even make money when the market moves down.

BUT you will have a very difficult time trying to make money when the market doesn't move at all.

And believe us, there will be times when the market is as still as the victims of Medusa. This lesson will help you determine the best times of day to trade.




Market Hours


Before looking at the best times to trade, we must look at what a 24-hour day in the forex world looks like.

The forex market can be broken up into four major trading sessions: the Sydney session, the Tokyo session, the London session, and Pipcrawler's favorite time to trade, the New York session. Below are tables of the open and close times for each session:

Summer
Session     Open (EDT)   Close (EDT)   Open (GMT)   Close (GMT)
Sydney      6:00 PM      3:00 AM       10:00 PM     7:00 AM
Tokyo       7:00 PM      4:00 AM       11:00 PM     8:00 AM
London      3:00 AM      12:00 PM      7:00 AM      4:00 PM
New York    8:00 AM      5:00 PM       12:00 PM     9:00 PM

Winter
Session     Open (EST)   Close (EST)   Open (GMT)   Close (GMT)
Sydney      4:00 PM      1:00 AM       9:00 PM      6:00 AM
Tokyo       6:00 PM      3:00 AM       11:00 PM     8:00 AM
London      3:00 AM      12:00 PM      8:00 AM      5:00 PM
New York    8:00 AM      5:00 PM       1:00 PM      10:00 PM
You can see that between sessions there are periods when two sessions are open at the same time. From 3:00-4:00 AM EDT, the Tokyo session and London session overlap, and from 8:00 AM-12:00 PM EDT, the London session and the New York session overlap.

Naturally, these are the busiest times of the trading day, because there is more volume when two markets are open at the same time. This makes sense: during those times, all the market participants are wheelin' and dealin', which means more money is changing hands.

Now, you're probably looking at the Sydney open and wondering why it shifts by two hours. You'd think that Sydney's open would only move one hour when the U.S. adjusts for standard time, but remember that when the U.S. shifts one hour back, Sydney actually moves forward by one hour (seasons are opposite in Australia). You should always keep this in mind if you ever plan to trade during that time period.
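The session overlaps described above can be worked out mechanically. Here's a minimal Python sketch (purely illustrative, not part of any trading platform; the hours are the summer GMT figures from the tables above, and sessions that cross midnight GMT are split into two intervals):

```python
# Toy sketch: compute the window(s) when two trading sessions
# are both open, given open/close hours on a 24-hour GMT clock.

def to_intervals(open_h, close_h):
    """Return a session as a list of (start, end) hour intervals."""
    if open_h < close_h:
        return [(open_h, close_h)]
    # Session wraps past midnight GMT (e.g. Sydney, Tokyo).
    return [(open_h, 24), (0, close_h)]

def overlap(a, b):
    """List the hour windows during which both sessions are open."""
    windows = []
    for a0, a1 in to_intervals(*a):
        for b0, b1 in to_intervals(*b):
            lo, hi = max(a0, b0), min(a1, b1)
            if lo < hi:
                windows.append((lo, hi))
    return windows

# Summer GMT hours from the table: London 7:00-16:00,
# New York 12:00-21:00, Tokyo 23:00-8:00 (wraps midnight).
london, new_york, tokyo = (7, 16), (12, 21), (23, 8)

print(overlap(london, new_york))  # [(12, 16)] -> 8:00 AM-12:00 PM EDT
print(overlap(tokyo, london))     # [(7, 8)]   -> 3:00-4:00 AM EDT
```

Nothing fancy, but it confirms the two overlap windows quoted above straight from the table's numbers.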

Let's take a look at the average pip movement of the major currency pairs during each trading session.
Pair Tokyo London New York
EUR/USD 76 114 92
GBP/USD 92 127 99
USD/JPY 51 66 59
AUD/USD 77 83 81
NZD/USD 62 72 70
USD/CAD 57 96 96
USD/CHF 67 102 83
EUR/JPY 102 129 107
GBP/JPY 118 151 132
AUD/JPY 98 107 103
EUR/GBP 78 61 47
EUR/CHF 79 109 84
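As a rough illustration (not trading advice), the table can be turned into a lookup that names the most active session for a given pair. This sketch hard-codes a few rows from the table above; the figures and pair selection are just examples:

```python
# Toy sketch: pick the session with the largest average pip
# movement for a pair, using a few rows from the table above.

AVG_PIPS = {  # pair: (Tokyo, London, New York) average pip range
    "EUR/USD": (76, 114, 92),
    "GBP/USD": (92, 127, 99),
    "EUR/GBP": (78, 61, 47),
}

SESSIONS = ("Tokyo", "London", "New York")

def busiest_session(pair):
    moves = AVG_PIPS[pair]
    return SESSIONS[moves.index(max(moves))]

print(busiest_session("EUR/USD"))  # London
print(busiest_session("EUR/GBP"))  # Tokyo
```

Note that for most pairs London is the busiest session, but EUR/GBP is an exception: it moves most during Tokyo hours, as the table shows.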





Need for Speed: The Run

The Need For Speed series has varied over the years between exciting arcade action (Need For Speed: Hot Pursuit), a closer approach to simulation (Need For Speed: Shift), and an underground racing romp (Need For Speed: Underground and its sequels). But Electronic Arts will once again be shifting (mind the pun) gears in a new direction later this year when it releases Need For Speed: The Run, the latest from long-time series developer Black Box. Rather than your typical street racing event, it turns to a nationwide tournament, one with dire consequences if you happen to place anything other than first.
The game focuses on a guy named Jack. He's a down-and-out guy who's fallen in with the wrong crowd, with both crooks and cops tailing him. He vows to get himself out of trouble, but the only thing he has to his credit is his driving skill, which is more than enough to escape his pursuers – until the next time, at least. Then Jack hears about a nationwide tournament called The Run, a "Cannonball Run"-esque contest of sorts (without Burt Reynolds and Dom DeLuise, obviously), where racers go from San Francisco to New York in one straight shot, as fast as possible. The first one across the finish line scores a $25 million payday, which would be more than enough to give Jack a new start.






Wednesday, September 21, 2011

Citigroup

Citigroup Inc. (NYSE: C), or Citi, is an American multinational financial services corporation headquartered in Manhattan, New York City. Citigroup was formed through one of the largest mergers in history, combining the banking giant Citicorp and the financial conglomerate Travelers Group on April 7, 1998.[3]
Citigroup Inc. has the world's largest financial services network, spanning 140 countries with approximately 16,000 offices worldwide. The company currently employs approximately 260,000 staff around the world, down from 267,150 in 2010, according to Forbes.[4][5] It also holds over 200 million customer accounts in more than 140 countries and is a primary dealer in US Treasury securities.[6] According to Forbes, at its height Citigroup was the largest company and bank in the world by total assets, with 357,000 employees, until the global financial crisis of 2008.[7] Today it ranks 24th by asset size, while HSBC ranks as the largest company and bank by assets in the world as of 2011.[8]
Citigroup suffered huge losses during the global financial crisis of 2008 and was rescued in November 2008 by a massive U.S. government bailout.[9] Its largest shareholders include funds from the Middle East and Singapore.[10] According to The New York Times, on February 23, 2009, Citigroup announced that the United States government would take a 36% equity stake in the company by converting $25 billion in emergency aid into common shares, with a US Treasury credit line of $45 billion, to prevent the bankruptcy of what was then the largest bank in the world. The government also guaranteed losses on more than $300 billion of troubled assets and injected $20 billion immediately into the company. In exchange, the CEO's salary was set at $1 per year and the highest employee salary was restricted to $500,000 in cash; any amount above $500,000 had to be paid in restricted stock that could not be sold until the emergency government aid was repaid in full. The US government also gained control of half the seats on the Board of Directors, and senior management became subject to removal by the US government for poor performance. By December 2009 the government's stake had been reduced from 36% to 27%, after Citigroup sold $21 billion of common shares and equity in the largest single share sale in US history, surpassing Bank of America's $19 billion share sale one month prior. By December 2010, Citigroup had repaid the emergency aid in full and the US government had received an additional $12 billion profit from selling its shares.[11][12][13][14][15] Government restrictions on pay and oversight of senior management were lifted after the US government sold its remaining 27% stake in December 2010.
According to the WSJ, the government aid was provided to prevent worldwide chaos and panic from the potential collapse of Citigroup's Global Transactions Services division, which moves more than $3 trillion around the world each day for most of the Fortune 500 companies, over 80 national governments, and 60 central banks. According to the article, Mr. Pandit said that if Citigroup had been allowed to unravel into bankruptcy, "100 governments around the world would be trying to figure out how to pay their employees."[16][17][18][19][20]
Despite huge losses during the global financial crisis, Citigroup Inc. built up an enormous cash pile in the wake of the crisis, with $247.6 billion in cash as of Q1 2011.[21] This was a result of selling the special assets placed in Citi Holdings, which were guaranteed against losses by the US Treasury while the company was under federal majority ownership. Additionally, according to The Washington Post, a special IRS tax exception was given to Citi to allow the US Treasury to sell its shares at a profit while it still owned Citigroup shares; the sales eventually netted $12 billion. According to Treasury spokeswoman Nayyera Haq, "This (IRS tax) rule was designed to stop corporate raiders from using loss corporations to evade taxes, and was never intended to address the unprecedented situation where the government owned shares in banks. And it was certainly not written to prevent the government from selling its shares for a profit."[22]
Citigroup is one of the Big Four banks in the United States, along with Bank of America, JPMorgan Chase and Wells Fargo.[23][24][25][26][27][28][29]
Contents
1 History
1.1 Citicorp
1.2 Travelers Group
1.3 Citicorp and Travelers merger
1.4 Travelers spin off
1.5 Subprime mortgage crisis
1.6 Federal assistance
1.7 Return to profitability, non-governmental shareholder ownership
2 Organization
2.1 Citicorp
2.2 Citi Holdings
3 Divisions
3.1 Global Consumer Group
3.2 Global Wealth Management
3.3 Citi Institutional Clients Group
4 Brands
4.1 Citi
4.2 Citibank
4.3 One Main Financial
4.4 CitiMortgage
4.5 Citi Capital Advisors
4.6 Citi Cards
4.7 Citi Private Bank
4.8 Citi Institutional Clients Group
4.9 Citi Investment Research
4.10 Citi Microfinance
4.11 Banamex
4.12 Woman & Co.
5 Real estate
6 Criticism
6.1 Raul Salinas and alleged money laundering
6.2 Conflicts of interest on investment research
6.3 Plutonomy memo
6.4 Enron, WorldCom and Global Crossing bankruptcies
6.5 Citigroup proprietary government bond trading scandal
6.6 2005 "Revisiting Plutonomy: The Rich Getting Richer" equity strategy public investment advisory
6.7 Regulatory action
6.8 Terra Securities scandal
6.9 Theft from customer accounts
6.10 Federal rescue 2008
6.11 Terra Firma Investments lawsuit
7 Public and government relations
7.1 Political donations
7.2 Lobbying and political advice
7.3 Public and governmental relations
8 Notes
9 References
10 External links
History

Citigroup was formed on October 9, 1998, following the $140 billion merger of Citicorp and Travelers Group to create the world's largest financial services organization.[3] The history of the company is thus divided between the workings of several firms that over time amalgamated into Citicorp, a multinational banking corporation operating in more than 100 countries, and Travelers Group, whose businesses covered credit services, consumer finance, brokerage, and insurance. As such, the company's history dates back to the founding of the City Bank of New York (later Citibank) in 1812; Bank Handlowy in 1870; Smith Barney in 1873; Banamex in 1884; and Salomon Brothers in 1910.[30]
Citicorp
The history begins with the City Bank of New York, which was chartered by New York State on June 16, 1812, with $2 million of capital. Serving a group of New York merchants, the bank opened for business on September 14 of that year, and Samuel Osgood was elected as the first President of the company.[31] The company's name was changed to The National City Bank of New York in 1865 after it joined the new U.S. national banking system, and it became the largest American bank by 1895.[31] It became the first contributor to the Federal Reserve Bank of New York in 1913, and the following year it inaugurated the first overseas branch of a U.S. bank in Buenos Aires, although the bank had, since the mid-nineteenth century, been active in plantation economies, such as the Cuban sugar industry. The 1918 purchase of U.S. overseas bank International Banking Corporation helped it become the first American bank to surpass $1 billion in assets, and it became the largest commercial bank in the world in 1929.[31] As it grew, the bank became a leading innovator in financial services, becoming the first major U.S. bank to offer compound interest on savings (1921); unsecured personal loans (1928); customer checking accounts (1936) and the negotiable certificate of deposit (1961).[31]
The bank changed its name to The First National City Bank of New York in 1955, which was shortened in 1962 to First National City Bank on the 150th anniversary of the company's foundation.[31] The company organically entered the leasing and credit card sectors, and its introduction of US dollar certificates of deposit in London marked the first new negotiable instrument in the market since 1888. In 1967 the bank introduced its First National City Charge Service credit card – popularly known as the "Everything card" – which would later become MasterCard.[31]
In 1976, under the leadership of CEO Walter B. Wriston, First National City Bank (and its holding company First National City Corporation) was renamed as Citibank, N.A. (and Citicorp, respectively). Shortly afterward, the bank launched the Citicard, which pioneered the use of 24-hour ATMs.[31] As the bank's expansion continued, the Narre Warren-Caroline Springs credit card company was purchased in 1981. John S. Reed was elected CEO in 1984, and Citi became a founding member of the CHAPS clearing house in London. Under his leadership, the next 14 years would see Citibank become the largest bank in the United States, the largest issuer of credit cards and charge cards in the world, and expand its global reach to over 90 countries.[31]
Travelers Group
Travelers Group, at the time of the merger, was a diverse group of financial concerns that had been brought together under CEO Sandy Weill. Its roots came from Commercial Credit, a subsidiary of Control Data Corporation that was taken private by Weill in November 1986 after he took charge of the company earlier that year.[3][32] Two years later, Weill masterminded the buyout of Primerica – a conglomerate that had already bought the life insurer A L Williams as well as the stockbroker Smith Barney. The new company took the Primerica name and employed a "cross-selling" strategy such that each of the entities within the parent company aimed to sell each other's services. Its non-financial businesses were spun off.[32]


The corporate logo of Travelers Inc. (1993–1998) prior to merger with Citicorp.
In September 1992, Travelers Insurance, which had suffered from poor real estate investments[3] and sustained significant losses in the aftermath of Hurricane Andrew,[33] formed a strategic alliance with Primerica that would lead to its amalgamation into a single company in December 1993. With the acquisition, the group became Travelers Inc. Property & casualty and life & annuities underwriting capabilities were added to the business.[32] Meanwhile, the distinctive Travelers red umbrella logo, which was also acquired in the deal, was applied to all the businesses within the newly named organization. During this period, Travelers acquired Shearson Lehman – a retail brokerage and asset management firm that was headed by Weill until 1985[3] – and merged it with Smith Barney.[32]
Salomon Brothers
Finally, in November 1997, Travelers Group (which had been renamed again in April 1995, when it merged with Aetna Property and Casualty, Inc.) made the $9 billion deal to purchase Salomon Brothers, a major bond dealer and bulge bracket investment bank.[32] This deal complemented Travelers/Smith Barney well, as Salomon was focused on fixed-income and institutional clients whereas Smith Barney was strong in equities and retail. Salomon Brothers absorbed Smith Barney into the new securities unit termed Salomon Smith Barney; a year later, the division incorporated Citicorp's former securities operations as well. The Salomon Smith Barney name was ultimately abandoned in October 2003 after a series of financial scandals that tarnished the bank's reputation.
Citicorp and Travelers merger
On April 6, 1998, the merger between Citicorp and Travelers Group was announced to the world, creating a $140 billion firm with assets of almost $700 billion.[3] The deal would enable Travelers to market mutual funds and insurance to Citicorp's retail customers while giving the banking divisions access to an expanded client base of investors and insurance buyers.
Although presented as a merger, the deal was actually more like a stock swap, with Travelers Group purchasing the entirety of Citicorp shares for $70 billion, and issuing 2.5 new Citigroup shares for each Citicorp share. Through this mechanism, existing shareholders of each company owned about half of the new firm.[3] While the new company maintained Citicorp's "Citi" brand in its name, it adopted Travelers' distinctive "red umbrella" as the new corporate logo, which was used until 2007.
The chairmen of both parent companies, John Reed and Sandy Weill respectively, were announced as co-chairmen and co-CEOs of the new company, Citigroup, Inc., although the vast difference in management styles between the two immediately raised questions about the wisdom of such a setup.
The remaining provisions of the Glass–Steagall Act – enacted following the Great Depression – forbade banks to merge with insurance underwriters, and meant Citigroup had between two and five years to divest any prohibited assets. However, Weill stated at the time of the merger that they believed "that over that time the legislation will change...we have had enough discussions to believe this will not be a problem".[3] Indeed, the passing of the Gramm-Leach-Bliley Act in November 1999 vindicated Reed and Weill's views, opening the door to financial services conglomerates offering a mix of commercial banking, investment banking, insurance underwriting and brokerage.[34]
Joe Plumeri headed the integration of the consumer businesses of Travelers Group and Citicorp after the merger, and was appointed CEO of Citibank North America by Weill and Reed.[35][36] He oversaw its network of 450 retail branches.[36][37][38] J. Paul Newsome, an analyst with CIBC Oppenheimer, said: "He's not the spit-and-polish executive many people expected. He's rough on the edges. But Citibank knows the bank as an institution is in trouble – it can't get away anymore with passive selling – and Plumeri has all the passion to throw a glass of cold water on the bank."[39] It was conjectured that he might become a leading contender to run all of Citigroup when Weill and Reed stepped down, if he were to effect a big, noticeable victory at Citibank.[39] In that position, Plumeri boosted the unit's earnings from $108 million to $415 million in one year, nearly quadrupling them.[40][41][42] He unexpectedly retired from Citibank, however, in January 2000.[43][44]
In 2000, Citigroup acquired Associates First Capital Corporation, which, until 1989, had been owned by Gulf+Western (now part of National Amusements). The Associates was widely criticized for predatory lending practices and Citi eventually settled with the Federal Trade Commission by agreeing to pay $240 million to customers who had been victims of a variety of predatory practices, including "flipping" mortgages, "packing" mortgages with optional credit insurance, and deceptive marketing practices.[45]
Travelers spin-off


The current logo for Travelers Companies
The company spun off its Travelers Property and Casualty insurance underwriting business in 2002. The spin-off was prompted by the insurance unit's drag on Citigroup's stock price, because Travelers' earnings were more seasonal and more vulnerable to large disasters, particularly the September 11, 2001 attacks on the World Trade Center in downtown New York City. It was also difficult to sell this kind of insurance directly to customers, since most industrial customers are accustomed to purchasing insurance through a broker.
The Travelers Property Casualty Corporation merged with The St. Paul Companies Inc. in 2004 forming The St. Paul Travelers Companies. Citigroup retained the life insurance and annuities underwriting business; however, it sold those businesses to MetLife in 2005. Citigroup still heavily sells all forms of insurance, but it no longer underwrites insurance.
Despite divesting Travelers Insurance, Citigroup retained Travelers' signature red umbrella logo as its own until February 2007, when it agreed to sell the logo back to St. Paul Travelers,[46] which renamed itself Travelers Companies. Citigroup also decided to adopt the corporate brand "Citi" for itself and virtually all its subsidiaries, except Primerica and Banamex.[46]
Subprime mortgage crisis
Heavy exposure to troubled mortgages in the form of collateralized debt obligations (CDOs), compounded by poor risk management, led Citigroup into trouble as the subprime mortgage crisis worsened in 2008. The company had used elaborate mathematical risk models that looked at mortgages in particular geographical areas but never included the possibility of a national housing downturn, or the prospect that millions of mortgage holders would default on their mortgages. Trading head Thomas Maheras was close friends with senior risk officer David Bushnell, which undermined risk oversight.[47][48] As Treasury Secretary, Robert Rubin was said to be influential in lifting the regulations that allowed Travelers and Citicorp to merge in 1998. Later, as members of Citigroup's board of directors, Rubin and Charles Prince were said to be influential in pushing the company towards MBS and CDOs in the subprime mortgage market.
As the crisis began to unfold, Citigroup announced on April 11, 2007, that it would eliminate 17,000 jobs, or about 5 percent of its workforce, in a broad restructuring designed to cut costs and bolster its long-underperforming stock.[49] Even after securities and brokerage firm Bear Stearns ran into serious trouble in summer 2007, Citigroup decided the possibility of trouble with its CDOs was so tiny (less than 1/100 of 1%) that it excluded them from its risk analysis. With the crisis worsening, Citigroup announced on January 7, 2008 that it was considering cutting another 5 percent to 10 percent of its workforce, which totaled 327,000.[50]
Federal assistance
Over the past several decades, the United States government has engineered at least four different rescues of the institution now known as Citigroup.[51] During the most recent, taxpayer-funded rescue, Citigroup was insolvent by November 2008 despite its receipt of $25 billion in federal TARP funds. On November 17, 2008, Citigroup announced plans for about 52,000 new job cuts, on top of 23,000 cuts already made during 2008, in a huge cull resulting from four consecutive quarters of losses and reports that it was unlikely to be profitable again before 2010. The same day, Wall Street responded by driving its stock market value down to $6 billion, from $300 billion two years prior.[52] As a result, Citigroup and federal regulators negotiated a plan to stabilize the company and forestall a further deterioration in its value. The arrangement called for the government to back about $306 billion in loans and securities and directly invest about $20 billion in the company. The assets remained on Citigroup's balance sheet; the technical term for this arrangement is ring-fencing. In a New York Times op-ed, Michael Lewis and David Einhorn described the $306 billion guarantee as "an undisguised gift" without any real crisis motivating it.[53] The plan was approved late in the evening on November 23, 2008.[9] A joint statement by the US Treasury Department, the Federal Reserve and the Federal Deposit Insurance Corp announced: "With these transactions, the U.S. government is taking the actions necessary to strengthen the financial system and protect U.S. taxpayers and the U.S. economy."
In late 2008, Citigroup held $20 billion of mortgage-linked securities, most of which had been marked down to between 21 cents and 41 cents on the dollar, as well as billions of dollars of buyout and corporate loans. It faced potentially massive losses on auto, mortgage and credit card loans if the economy worsened.[citation needed]
On January 16, 2009, Citigroup announced its intention to reorganize itself into two operating units: Citicorp for its retail and institutional client business, and Citi Holdings for its brokerage and asset management.[54] Citigroup will continue to operate as a single company for the time being, but Citi Holdings managers will be tasked to "tak[e] advantage of value-enhancing disposition and combination opportunities as they emerge",[54] and eventual spin-offs or mergers involving either operating unit have not been ruled out.[55] On February 27, 2009 Citigroup announced that the United States government would be taking a 36% equity stake in the company by converting $25 billion in emergency aid into common shares. Citigroup shares dropped 40% on the news.
On June 1, 2009, it was announced that Citigroup Inc. would be removed from the Dow Jones Industrial Average effective June 8, 2009, due to significant government ownership. Citigroup Inc. was replaced by Travelers Co.[56]
Return to profitability, non-governmental shareholder ownership
In 2010, Citigroup achieved its first profitable year since 2007. It reported $10.6 billion in net profit, compared with a $1.6 billion loss in 2009.[57] Late in 2010, the government sold its remaining stock holding in the company, yielding an overall net profit to taxpayers of $12 billion.[58]
Organization

Citi is organized into two major segments – Citicorp and Citi Holdings.[59]
Citicorp
Regional Consumer Banking
Retail Banking, Local Commercial Banking and Citi Personal Wealth Management
North America, EMEA, Latin America and Asia; Residential real estate in North America
Citi-Branded Cards
North America, EMEA, Latin America and Asia
Latin America Asset Management
Institutional Clients Group
Securities and Banking
Investment banking
Debt and equity markets (including prime brokerage)
Lending
Private equity
Hedge funds
Real estate
Structured products
Private Bank
Equity and Fixed Income research
Transaction Services
Cash management
Trade services
Custody and fund services
Clearing services
Agency/trust
Citi Holdings
Brokerage and Asset Management
Largely includes investment in and associated earnings from Morgan Stanley Smith Barney joint venture
Retail alternative investments
Local Consumer Lending
North America
Consumer finance lending: residential and commercial real estate; auto, student and personal loans; and consumer branch lending
Retail partner cards
Certain international consumer lending (including Western Europe retail banking and cards)
Special Asset Pool
Certain institutional and consumer bank portfolios


IBM

International Business Machines (IBM) (NYSE: IBM) is an American multinational technology and consulting firm headquartered in Armonk, New York. IBM manufactures and sells computer hardware and software, and it offers infrastructure, hosting and consulting services in areas ranging from mainframe computers to nanotechnology.[2]
The company was founded in 1911 as the Computing-Tabulating-Recording Company through a merger of four companies: the Tabulating Machine Company, the International Time Recording Company, the Computing Scale Corporation, and the Bundy Manufacturing Company.[3][4] CTR adopted the name International Business Machines in 1924, a name it had previously used for its Canadian and later South American subsidiaries. Its distinctive culture and product branding have given it the nickname Big Blue.
In 2011, Fortune ranked IBM the 18th largest firm in the U.S.,[5] as well as the 7th most profitable.[6] Globally, the company was ranked the 31st largest firm by Forbes for 2011.[7][8] Other rankings for 2011 include #1 company for leaders (Fortune), #2 best global brand (Interbrand), #1 green company worldwide (Newsweek), #12 most admired company (Fortune), and #18 most innovative company (Fast Company).[9] IBM employs more than 425,000 employees (sometimes referred to as "IBMers") in over 200 countries, with occupations including scientists, engineers, consultants, and sales professionals.[10]
IBM holds more patents than any other U.S.-based technology company and has nine research laboratories worldwide.[11] Its employees have garnered five Nobel Prizes, four Turing Awards, nine National Medals of Technology, and five National Medals of Science.[12] Famous IBM inventions include the automated teller machine (ATM), the floppy disk, the hard disk drive, the magnetic stripe card, the relational database, the Universal Product Code (UPC), the financial swap, the SABRE airline reservation system, DRAM, and the Watson artificial intelligence system.
The company has undergone several organizational changes since its inception, acquiring companies like SPSS (2009) and PwC consulting (2002), spinning off companies like Lexmark (1991), and selling off product lines like ThinkPad to Lenovo (2005).
Contents
1 History
1.1 1880-1929
1.2 1930-1979
1.3 1980-present
2 Corporate affairs
2.1 Corporate recognition and brand
2.2 Working at IBM
3 Research and inventions
4 Selected current projects
5 Environmental record
6 Company logo and nickname
7 See also
8 References
9 Further reading
10 External links
History

Main article: History of IBM
1880-1929

"THINK"

Thomas J. Watson, who led IBM from 1914 to 1956, discussing the company's motto "THINK"
Starting in the 1880s, various technologies came into existence that would form part of IBM's predecessor company. Julius E. Pitrap patented the computing scale in 1885;[13] Alexander Dey invented the dial recorder (1888);[14] in 1889, Herman Hollerith patented the Electric Tabulating Machine[15] and Willard Bundy invented a time clock that recorded a worker's arrival and departure time on a paper tape.[16] On June 16, 1911, these technologies and their respective companies were merged by Charles Ranlett Flint to form the Computing-Tabulating-Recording Company (C-T-R).[17] The New York City-based company had 1,300 employees, with offices and plants in Endicott and Binghamton, New York; Dayton, Ohio; Detroit, Michigan; Washington, D.C.; and Toronto, Ontario. It manufactured and sold machinery ranging from commercial scales and industrial time recorders to meat and cheese slicers, along with tabulators and punched cards.
Flint recruited Thomas J. Watson, Sr., from the National Cash Register Company to help lead the company in 1914.[17] Watson implemented "generous sales incentives, a focus on customer service, an insistence on well-groomed, dark-suited salesmen and an evangelical fervor for instilling company pride and loyalty in every worker".[18] His favorite slogan, "THINK," became a mantra for C-T-R's employees, and within 11 months of joining C-T-R, Watson became its president.[18] The company focused on providing large-scale, custom-built tabulating solutions for businesses, leaving the market for small office products to others. During Watson's first four years, revenues more than doubled to $9 million and the company's operations expanded to Europe, South America, Asia, and Australia.[18] On February 14, 1924, C-T-R was renamed the International Business Machines Corporation (IBM),[9] citing the need to align its name with the "growth and extension of [its] activities".[19]
1930-1979


NACA researchers using an IBM type 704 electronic data processing machine in 1957
In 1937, IBM's tabulating equipment enabled organizations to process unprecedented amounts of data. Its clients included the U.S. government, during its first effort to maintain employment records for 26 million people pursuant to the Social Security Act,[20] and the Third Reich,[21] largely through the German subsidiary Dehomag. Also in 1937, the company's president met with Adolf Hitler to discuss the supply of equipment, and in 1941 equipment was leased to camps that held prisoners. During the Second World War the company produced small arms (the M1 Carbine and the Browning Automatic Rifle).
In 1952, Thomas J. Watson, Jr., became president of the company, ending almost 40 years of leadership by his father. In 1956, Arthur L. Samuel of IBM's Poughkeepsie, New York, laboratory programmed an IBM 704 to play checkers using a method in which the machine could "learn" from its own experience. It is believed to be the first "self-learning" program, a demonstration of the concept of artificial intelligence. In 1957, IBM developed the FORTRAN (FORmula TRANslation) scientific programming language. In 1961, Thomas J. Watson, Jr., was elected chairman of the board and Albert L. Williams became president of the company. IBM also developed the SABRE (Semi-Automated Business Research Environment) reservation system for American Airlines. The IBM Selectric typewriter, a highly successful line of electric typewriters, was introduced by IBM on July 31, 1961.
In 1963, IBM employees and computers helped NASA track the orbital flight of the Mercury astronauts, and a year later, the company moved its corporate headquarters from New York City to Armonk, New York. The latter half of that decade saw IBM continue its support of space exploration, with IBM participating in the 1965 Gemini flights, the 1966 Saturn flights, and the 1969 mission to land a man on the moon.
On April 7, 1964, IBM announced the first computer system family, the IBM System/360. Sold between 1964 and 1978, it was the first family of computers designed to cover the complete range of applications, from small to large, both commercial and scientific. For the first time, companies could upgrade their computing capabilities with a new model without rewriting their applications.
In 1973, IBM engineer George J. Laurer developed the Universal Product Code.[22]
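The UPC-A code Laurer developed protects itself with a check digit: triple the sum of the digits in the odd positions of the first 11 digits, add the digits in the even positions, and pick the final digit that brings the total to a multiple of 10. A minimal sketch (the sample digits are a commonly cited UPC-A example):

```python
def upc_check_digit(digits11: str) -> int:
    """Compute the UPC-A check digit from the first 11 digits."""
    odd = sum(int(d) for d in digits11[0::2])   # positions 1, 3, ..., 11
    even = sum(int(d) for d in digits11[1::2])  # positions 2, 4, ..., 10
    return (10 - (3 * odd + even) % 10) % 10

print(upc_check_digit("03600029145"))  # -> 2, so the full code is 036000291452
```

A scanner recomputes this sum over all 12 digits and rejects the read if it is not a multiple of 10, catching any single-digit misread.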


IBM's Blue Gene supercomputers were awarded the National Medal of Technology and Innovation by U.S. President Barack Obama on September 18, 2009.
1980-present
Financial swaps were first introduced to the public in 1981, when IBM and the World Bank entered into a swap agreement.[23] The IBM PC, originally designated the IBM 5150, was introduced in 1981 and became the industry standard. In 1991, IBM sold Lexmark, and in 2002 it acquired PwC Consulting. In 2003, IBM initiated a project to rewrite its company values. Using its Jam technology, the company hosted Internet-based online discussions on key business issues with 50,000 employees over three days. The discussions were analyzed by text analysis software (eClassifier) to mine the comments for themes. As a result of the 2003 Jam, the company values were updated to reflect three modern business, marketplace and employee views: "Dedication to every client's success", "Innovation that matters - for our company and for the world", and "Trust and personal responsibility in all relationships".[24] In 2004, another Jam was conducted, during which 52,000 employees exchanged best practices for 72 hours, focusing on actionable ideas to support implementation of the previously identified values.[25]
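The 1981 IBM-World Bank deal was a currency swap; as a simplified illustration of the general mechanics, the sketch below computes the net cash flows of a plain fixed-for-floating interest-rate swap, where one party pays a fixed rate and receives a floating rate on the same notional (all figures are hypothetical):

```python
def swap_net_cashflows(notional: float, fixed_rate: float,
                       floating_rates: list) -> list:
    """Net periodic cash flows to the fixed-rate payer:
    receive floating, pay fixed, one settlement per floating-rate fixing."""
    return [notional * (flt - fixed_rate) for flt in floating_rates]

# The fixed payer loses when floating is below 5% and gains when it is above.
flows = swap_net_cashflows(1_000_000, 0.05, [0.04, 0.05, 0.06])
```

Only the net difference changes hands each period, which is what makes a swap a cheap way to transform an existing liability rather than refinance it.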
In 2005 the company sold its personal computer business to Lenovo, and in 2009, it acquired software company SPSS Inc. Later in 2009, IBM's Blue Gene supercomputing program was awarded the National Medal of Technology and Innovation by U.S. President Barack Obama.
In 2011, IBM gained worldwide attention for its artificial intelligence program Watson, which competed on Jeopardy! and defeated game show champions Ken Jennings and Brad Rutter.
Corporate affairs

IBM's headquarters complex is located in Armonk, in the Town of North Castle, New York, United States.[26][27][28] The 283,000-square-foot (26,300 m2) IBM building has three levels of custom curtainwall and sits on a 25-acre site.[29] IBM has been headquartered in Armonk since 1964.[citation needed]
The company has nine research labs worldwide—Almaden, Austin, Brazil, China, Haifa, India, Tokyo, Watson (New York), and Zurich—with Watson (dedicated in 1961) serving as headquarters for the research division and the site of its annual meeting. Other campus installations include towers in Montreal, Paris, and Atlanta; software labs in Raleigh-Durham, Rome and Toronto; buildings in Chicago, Johannesburg, and Seattle; and facilities in Hakozaki and Yamato. The company also operates the IBM Scientific Center, the Hursley House, the Canada Head Office Building, IBM Rochester, and the Somers Office Complex. The company's contributions to architecture and design, including Chicago's 330 North Wabash building designed by Ludwig Mies van der Rohe, were recognized with the 1990 Honor Award from the National Building Museum.[30]
IBM's Board of Directors, with 14 members, is responsible for the overall management of the company. With Cathie Black's resignation from the board in November 2010, the remaining 13 members (along with their affiliation and year of joining the board) are as follows: Alain J. P. Belda '08 (Alcoa), William R. Brody '07 (Salk Institute / Johns Hopkins University), Kenneth Chenault '98 (American Express), Michael L. Eskew '05 (UPS), Shirley Ann Jackson '05 (Rensselaer Polytechnic Institute), Andrew N. Liveris '10 (Dow Chemical), W. James McNerney, Jr. '09 (Boeing), James W. Owens '06 (Caterpillar), Samuel J. Palmisano '00 (IBM), Joan Spero '04 (Doris Duke Charitable Foundation), Sidney Taurel '01 (Eli Lilly), and Lorenzo Zambrano '03 (Cemex).[31]
Various IBM facilities

IBM Rochester (Minnesota), nicknamed the "Big Blue Zoo"

IBM Avenida de América Building in Madrid, Spain

Somers (New York) Office Complex, designed by I.M. Pei

IBM Japan Makuhari Technical Center, designed by Yoshio Taniguchi

IBM Haifa Research Lab, Haifa, Israel

IBM Kolkata Building, Kolkata, India
Corporate recognition and brand
In 2011, Fortune ranked IBM the 18th largest firm in the U.S.,[5] as well as the 7th most profitable.[6] Globally, the company was ranked the 31st largest firm by Forbes for 2011.[32] Other rankings for 2011 include the following:[9]
#1 company for leaders (Fortune)
#2 best global brand (Interbrand)
#1 green company worldwide (Newsweek)[33]
#12 most admired company (Fortune)
#18 most innovative company (Fast Company)
For 2010, IBM's brand was valued at $64.7 billion.[34]
Working at IBM
In 2010, IBM employed 105,000 workers in the U.S., a drop of 30,000 since 2003, and 75,000 people in India, up from 9,000 seven years earlier.[35]
IBM's employee management practices can be traced back to its roots. In 1914, CEO Thomas J. Watson boosted company spirit by creating employee sports teams, hosting family outings, and furnishing a company band. In 1924, the Quarter Century Club, which recognizes employees with 25 years of service, was organized and the first issue of Business Machines, IBM's internal publication, was published. In 1925, the first meeting of the Hundred Percent Club, composed of IBM salesmen who meet their quotas, convened in Atlantic City, New Jersey.
IBM was among the first corporations to provide group life insurance (1934), survivor benefits (1935) and paid vacations (1937). In 1932 IBM created an Education Department to oversee training for employees, which oversaw the completion of the IBM Schoolhouse at Endicott in 1933. In 1935, the employee magazine Think was created. Also that year, IBM held its first training class for women systems service professionals. In 1942, IBM launched a program to train and employ disabled people in Topeka, Kansas. The next year, classes began in New York City, and soon the company was asked to join the President's Committee for Employment of the Handicapped. In 1946, the company hired its first black salesman, 18 years before the Civil Rights Act of 1964. In 1947, IBM announced a Total and Permanent Disability Income Plan for employees, and a vested-rights pension was added to the IBM retirement plan.
In 1952, Thomas J. Watson, Jr., published the company's first written equal opportunity policy letter, one year before the U.S. Supreme Court decision in Brown v. Board of Education and 11 years before the Civil Rights Act of 1964. In 1961, IBM's nondiscrimination policy was expanded to include sex, national origin, and age. The following year, IBM hosted its first Invention Award Dinner honoring 34 outstanding IBM inventors; and in 1963, the company named the first eight IBM Fellows in a new Fellowship Program that recognizes senior IBM scientists, engineers and other professionals for outstanding technical achievements.


An IBM delivery tricycle in Johannesburg, South Africa in 1965
On September 21, 1953, Thomas Watson, Jr., the company's president at the time, sent out a controversial letter to all IBM employees stating that IBM needed to hire the best people, regardless of their race, ethnic origin, or gender. He also publicized the policy so that in his negotiations to build new manufacturing plants with the governors of two states in the U.S. South, he could be clear that IBM would not build "separate-but-equal" workplaces.[36] In 1984, IBM added sexual orientation to its nondiscrimination policy. The company stated that this would give IBM a competitive advantage because IBM would then be able to hire talented people its competitors would turn down.[37]
IBM was the only technology company ranked in Working Mother magazine's Top 10 for 2004, and one of two technology companies in 2005.[38][39] On October 10, 2005, IBM became the first major company in the world to commit formally to not using genetic information in employment decisions. The announcement was made shortly after IBM began working with the National Geographic Society on its Genographic Project.
IBM provides same-sex partners of its employees with health benefits and provides an anti-discrimination clause. The Human Rights Campaign has consistently rated IBM 100% on its index of gay-friendliness since 2003 (in 2002, the year it began compiling its report on major companies, IBM scored 86%).[40] In 2007 and again in 2010, IBM UK was ranked first in Stonewall's annual Workplace Equality Index for UK employers.[41]
The company has traditionally resisted labor union organizing,[42] although unions represent some IBM workers outside the United States. In 2009, the Unite union stated that several hundred employees joined following the announcement in the UK of pension cuts that left many employees facing a shortfall in projected pensions.[43]
A dark (or gray) suit, white shirt, and a "sincere" tie[44] was the public uniform for IBM employees for most of the 20th century. During IBM's management transformation in the 1990s, CEO Louis V. Gerstner, Jr. relaxed these codes, normalizing the dress and behavior of IBM employees to resemble their counterparts in other large technology companies. Since then, IBM's dress code has been business casual, although employees often wear formal clothes to client meetings.[citation needed]
On 16 June 2011, the company announced a grants program, called IBM100, to fund its employees' participation in volunteer projects; the year-long initiative is part of the company's centenary celebrations.[45]
Research and inventions



An anechoic chamber inside IBM's Yamato research facility
In 1945, the Watson Scientific Computing Laboratory was founded at Columbia University in New York City. The renovated fraternity house on Manhattan's West Side was IBM's first laboratory devoted to pure science. The lab was the forerunner of IBM's Research Division, which today operates research facilities around the world.
In 1966, IBM researcher Robert H. Dennard invented the Dynamic Random Access Memory (DRAM) cell, a one-transistor memory cell that stores a single bit of information as an electrical charge in an electronic circuit. The technology permitted major increases in memory density and was widely adopted throughout the industry, where it remains in widespread use today.
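Because a DRAM cell stores its bit as charge on a tiny capacitor, the charge leaks away and must be periodically refreshed. A toy model of that behavior (the time constant and sense threshold are arbitrary illustrative assumptions, not real device parameters):

```python
import math

def stored_charge(t: float, q0: float = 1.0, tau: float = 0.064) -> float:
    """Exponential leakage of a cell's stored charge over t seconds (toy model)."""
    return q0 * math.exp(-t / tau)

def needs_refresh(t: float, threshold: float = 0.5) -> bool:
    """True once the charge has decayed below the sense threshold."""
    return stored_charge(t) < threshold
```

In this model the charge halves roughly every tau * ln(2) seconds, which is why real DRAM controllers rewrite every row on a fixed refresh interval rather than waiting for data to fade.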
IBM has been a leading proponent of the Open Source Initiative, and began supporting Linux in 1998.[46] The company invests billions of dollars in services and software based on Linux through the IBM Linux Technology Center, which includes over 300 Linux kernel developers.[47] IBM has also released code under different open source licenses, such as the platform-independent software framework Eclipse (worth approximately US$40 million at the time of the donation),[48] the three-sentence International Components for Unicode (ICU) license, and the Java-based relational database management system (RDBMS) Apache Derby. IBM's open source involvement has not been trouble-free, however (see SCO v. IBM).

Enhanced by Zemanta