On a recent episode of the reality show Keeping Up with the Kardashians, Kim Kardashian West was told by her mother Kris Jenner that they had to “make do” (with Kim’s early bookings) because Kim couldn’t sing or dance. The undertone was: “We made the most out of the assets and talents you did have.”
This little scene would lend credence to the rumors that Kris, a talent manager by trade, purposefully released the sex tape that launched Kim’s career of fame for being famous. But that particular conspiracy theory isn’t as important as what the Kardashian Klan has built, all of it based on sex and beauty. Millions, people. $$$$$$$$s. (<- that’s pronounced mee-YO-nezz)
Wait, scratch that. It’s probably Bee-YO-nezz by now.
Lots of hate arises, of course. Any woman who uses any sexual image to earn money is immediately loathed. But, come to think of it, many female writers are loathed on the regular. Dorothy Parker and Virginia Woolf used their wits, yet were still hated on. This is a no-win endeavor.
This topic has sparked conversations with some friends online. It’s a sore point for many; you’d need a PhD to sort out all the implications. I’m trying to figure it out myself. Traditionally, I haven’t used my body’s image to “sell” anything.
Lately I’ve been wondering if I’m doing it wrong.
This isn’t to say I’ll be posting nudes any time soon. I’m not interested in that. But, I am asking myself, why am I not turning the camera toward my own physical vessel? What am I saying by intentionally editing out all images of my body/parts from the internet?
Another thought: Nancy Pelosi once mentioned that all the hubbub around her clothes didn’t bother her at all. She said that if her clothes are what get young girls interested in politics, then so be it. There’s some kind of wisdom in there… I’m not sure I know what it is.
Wait, scratch that. I know what it is. It’s Kris Jenner’s “making do.” I’m simply afraid of the backlash I’ll get for adopting the sentiment. So, this, too, is a no-win endeavor. Perhaps the same exact one.
My friend is a well-known blogger who often posts her smiling, BEAUTIFUL face on social media. Her eyes, especially, are to die for, her skin running a close second. I enjoy seeing her in my streams. Granted, she’s not posting sexy, provocative pics (she could definitely leverage that cleavage), but she’s posting selfies. Her friend is an artist who shoots nude self-portraits in abandoned buildings, resulting in art that conveys multiple layers of devastating meaning.
If my Instagram selfies get readers interested in my work, then are the images serving a (good) purpose? I must walk a fine line, of course. After all, I’m a professional WRITER, not a model. I’m not doing MILF nudies for a living (don’t suggest it – I’ve already heard it more than once… NOT THAT THERE’S ANYTHING WRONG WITH MILF PRON). Too much sexiness may disrupt the image I’m trying to cultivate with my presence online: capable writer, industrious thinker, keen strategist.
Wait, scratch that. Perhaps a little more leg would make me seem like a keen strategist, after all.
Find me on Instagram as @PurpleCar_cc (the bastards wouldn’t let me recover my original PurpleCar account). My recent selfies are the photos you see in this blog post.
This may sound like a “Don’t hate the players, hate the game” proposal. It isn’t. In all this renewed talk against so-called “mommy bloggers,” stirred up by a recent essay, we must turn a critical eye toward the culture and the system that elicited a constant supply of the “harrowing personal essay.” What, really, has encouraged moms and other writers to lay it all out online?
Feed me, Seymour
Let’s all agree on these basic tenets: Writers need to write, and the internet needs to be fed.
We understand, inasmuch as we understand the strange species called “writer,” that writers are compelled to write. They’d write on the walls with their own blood if you took away their tools. But what exactly does the internet feed on and why is it so starved for it?
Back it up
In the early ARPANET/university/military era, the internet wanted more server connections. In the middle days, starting in earnest once the World Wide Web (invented in 1989, opened to the public in 1991) took off in the mid-1990s, the internet wanted one more thing: attention. Behind the endless appetite for readers and clicks was, of course, money: profits for service providers, earnings for website hosts, sales for advertisers. The internet and the web run on this rule and this rule only: “attention = money.”
The early blogging culture was sooooo inviting. Intoxicating, really. Readers were welcoming. We linked to each other’s blogs. We commented. When we shared, we connected. The rush of sharing-then-connection raised the “worthy of reading” bar more and more. Even I, the cynic, the techie trained to be suspicious of all things internet, fell into the vortex a little bit. Here’s an abbreviated timeline of my own online experience:
1979: First personal computer in our house.
1988: I started online in forums. I had an anonymous username, as did everyone else, and sharing was über-encouraged.
1995: I had a website (buried in the taxonomy of a work server) but no username. My image and name were there for all to see. BUT: no sharing. It was a static site with pictures, almost like a very staid pre-LinkedIn page.
2004: I started a blog (this one), but again, with a username. For three years, I blogged about my personal feelings and experiences, but knowing I wasn’t totally anonymous (friends and family knew “purplecar” was me), I avoided the “harrowing overshare.” Working in technology, I definitely did my best to keep my husband’s and children’s identities offline. (Side note: because I was an early blogger and a woman, men referred to me as a ‘mommy blogger.’)
2007: I deleted the past 3 years’ posts because I didn’t feel comfortable sharing those inner thoughts any longer. My subject matter, the Psychology of Technology, has stayed steady ever since. Rarely do I mention my children on any of my channels. I keep family life private.
Here’s a (very) abbreviated timeline of blogger events, with an emphasis on moms:
2010ish: Bloggers enraged the Bitternet, and even the more sane types, by not revealing their paid reviews. At some point, federal regulations (the FTC’s updated Endorsement Guides) dictated that all bloggers must reveal any income sources that might affect the reader’s impressions. For example, if a mom received a free sample, she had to publicly acknowledge it in the post about that product. Bloggers were also compelled to reveal whether any link was an affiliate link, right next to or in the text of said link.
2016: A can of hate is opened up on moms-who-write-about-their-families, spurred by a recent article by Elizabeth Bastos entitled “Why I Decided to Stop Writing about My Children” in the NYT’s “Well” Blog. In summary, the author discovered she was betraying her children’s privacy and trust by writing about them online. Lots of hand-wringing, finger-pointing and pearl-clutching ensues.
Yes: hate the game. But name the game first
I linked to the Bastos article on my social channels because I’m a big believer in its basic message: be mindful of a child’s right to form their own identity, free of a forever-documented personal history. I think we Gen-Xers grew up without the internet’s bloodsucking; as parents, we can do our best to give our kids at least a semblance of that same safe space in which to experiment with different identities.
In hindsight, I realize my sharing of the article may have inaccurately portrayed my beliefs about parents writing about their kids. Admittedly, trust between a parent and child is a sore spot for me, as it is for many of my peers. My current rage isn’t directed at parent writers, though, at least not fully. 95% of my rage is for the invisible machine that entices and then consumes the humanity in all of us. Why aren’t the for/against-parent-blogging camps talking about the economy-driven internet? Why does the machine get a free pass? (and why are women seemingly always dumped on the most?)
We need to name the machine. It wants us, it needs us, to bare our souls. Because everybody knows: souls sell best.
Image Credit: Me. They may or may not be my children behind the blur. That is the real Liberty Bell, though.
Two definitions you need for this post. I coined “Bitternet” … because someone had to name that phenomenon.
normal (noun; usually in plural form “normals”): A human in a society who remains mostly unaffected by, or not exposed to, any particular trend in technology or its subsets of social media, online sharing, or gaming; not an early adopter, but not necessarily an anti-technology “luddite.” <the normals don’t care about infosec>
Bitternet (noun): An online, electronic communications network of humans who react to any new development, post, or idea with immediate, harshly reproachful, biting, resentful, and summarily negative judgments. <the Philadelphia Bitternet will have a field day with the new bus schedules>
Boo-hoo, Bitternet! Pokémon Go won.
Pokémon Go is the first massive augmented reality (AR) game to be mass-adopted by normals. In an article posted on TechCrunch today entitled “The Pokémon Go influence on new tech,” Dr. Roger Smith, Chief Technical Officer of the Nicholson Center at Florida Hospital in Celebration, Florida (a Disney-developed town), surmises that the game has taken its place in history as the harbinger of a new frontier: “Suddenly everyone understands what ‘augmented reality’ means and how an artificial digital world can be mapped onto the real physical world,” Dr. Smith writes.
This is kind of a big deal.
The Bitternet backlash was confined to two weeks at most. The game’s overwhelming popularity with normals silenced the knee-jerk vitriol in record time. Gaming in any sense is usually met with a fierce and constantly fired-up Bitternet, causing gaming and gaming issues to be viewed as fringe or niche subjects. But not with Pokémon Go, and many experts are scrambling to understand why.
I don’t know what to tell them other than: It’s time. AR is here to stay, albeit for a while in its current, very simple form. Even my spouse, who is (mostly) a normal (he’s been a video gamer and poker player for decades but not a social web person at all), admitted he’d play a Harry Potter version of Pokémon Go. Trust me, that’s groundbreaking. Something has changed, and it has changed for good.
Ingress, the early, user-populated version of Pokémon Go
I played Ingress, the previous AR game by Pokémon Go creator Niantic. Ingress is built on an extensive, user-populated network of portals (the same grid now used for Pokémon Go – users submitted suggestions for the portals). Niantic made the game compelling by constructing characters, background information, and a complex, regularly updated story line that allowed players to truly immerse themselves in the game. Local and global communities naturally popped up. Niantic programmed in massive meetup events. Human connection became the key feature of the game. The game still rages on with a dedicated userbase. (iPhone battery usage was an issue for me, as was the difficulty of leveling up, so Ingress didn’t keep my attention. But I did love it, and I’m considering going back now that the battery issue has improved.)
Ingress was definitely designed for gamers, though. The science fiction story one needs to adopt to enjoy the game is geared toward fans of the genre and gamers. Pokémon Go, on the other hand, is a Disney-ification of Ingress; it’s a game for everyone. The first several levels are quite easy and require little to no skill.
The key factor here, though, is the decidedly NON-gaming, NON-scifi feel of the play. Where Ingress is dark and futuristic, Pokémon Go is cartoonish and bright. Ingress contains a plot point that implies danger to humanity and our way of life. Pokémon Go has cute monsters who battle each other for, basically, the entertainment of humans.
PokéMoms (do I really need a definition for this?) know the game is something they can do with their kids. Tweens to young adults aren’t turned off by the presence of those PokéMoms out at the gyms, battling away with their amped-up Lapras ’mons. Even the Bitternet (my term for the knee-jerk mass complainers on social sites; see definition above) has dialed down its hate, forced into submission by the sheer number of players and advocates.
What the Bitternet should really use its evil powers for
In the TechCrunch article, Dr. Smith goes on to postulate how businesses can use augmented reality games. Being a Disney employee (or so I assume, since Florida Hospital is located in Celebration, Florida, a Disney-developed town), Dr. Smith naturally cites Disney’s in-park game where players can pursue Disney characters (which are overlaid onto a phone’s camera view). “The entertainment giant can decide when and where these virtual characters appear, contributing to crowd control, restaurant business, gift purchases and a richer experience in the theme park,” Dr. Smith writes.
“Richer experience in the theme park,” I take to mean richer for Disney. All Mickey has to do is lead a kid into a store, point at something, and say “BUY THIS FOR MORE MICKEY POINTS!” I’m sure that isn’t the meaning the writer intended, but it sure is a revealing Freudian slip, isn’t it?
All of the Bitternet’s “critical eye” (more like cynical but I’m trying to be polite) could be useful when it comes to business and AR games. We could use a check and balance on all things commercial. AR games are meant to enhance personal connection, not the wallets of megacorporations. Reminding the world of that will be a constant battle no Pokémon can win alone. We’ll need everyone to pitch in to keep gaming for normals fun, pro-connection and safe for our own wallets.
Photo Credit: Me. That’s a screenshot of the famous Pokémon I caught yesterday, thanks to my husband, who drove us a few feet off our intended route so I could catch this Pikachu.
Definitions were based on Merriam-Webster.com’s definitions (of various words). The yearly subscription to the site’s Unabridged Dictionary is the best money a writer can spend.
We’re all quite familiar with the crash-and-burn of the music industry. They went down screaming, too, didn’t they? Suing individuals for millions for sharing music online (in those very early days!). Whining about how consumers weren’t buying albums anymore. We were those consumers, and we didn’t shed many tears for them, did we? After the lawsuits, we shared more music online. We refused to buy albums full of crap with only one (or, if you were lucky, two) decent tracks.
The music industry adapted to consumers’ anger about mostly-bad albums by first selling singles on cassette tapes, then selling singles online, most notably via iTunes. Online and satellite radio, with its subscription-plus-little-to-no-advertising model, is becoming the preferred listening method of consumers in the US.
Consumers are saying a few things with their behavior:
1. They are willing to pay for good, quality work.
2. They are willing to pay to stop ads.
3. They want more control over what they pay for.
4. Contrary to popular marketing belief, consumers want a lot of choices. (I’m sure a critical mass/upper limit exists, but that ceiling is a lot higher than the one we run into now.)
The newspaper industry is a lot like the music industry: fill up a product with crap, include one gem, then essentially make consumers pay full price for that one gem. Also: treat consumers as one, big mass instead of individuals with unique tastes.
The music industry is addressing all of these aspects. We can now buy exactly the song we want. We have plenty of choices on all the who, what, where, when, why and how we buy music. The music industry is finding other income streams, mostly from advertisers, by selling rights to songs. (They’ve always done this but now it’s almost immediate – you can hear the same song on streamed radio then hear it in a McDonald’s commercial on TV that same day).
Newspapers definitely need the subscription money. No doubt about this. As John Oliver said last week in his now oft-quoted rant, “Sooner or later, we are either going to have to pay for journalism, or we are all going to pay for it.”
Let’s get over that hump. Subscriptions are essential. How do subscriptions work for a digital news organization?
I’ve used the “flood” metaphor before in describing the information/data age of digital media. In a flood, you can’t simply drink the water that’s all around you. The water flooding the streets is tainted. Because of the flood water, the tap water is also tainted. If you drink from your regular water sources, you will become ill. You have to collect rainwater yourself (difficult, and not reliably safe to drink), filter flood water carefully (which takes a lot of effort and equipment), or get water from an outside source. Relief organizations bring tons and tons of bottled water and water-dispensing trucks to flooded areas. In a natural disaster, I’d happily accept a bottle of water handed to me by a FEMA agent or a Red Cross volunteer, as the water they’d hand me would very likely be safe to drink.
Information is like water.
We news consumers will happily pay for “safe” (i.e. reliable) information. But just as a water bottle has to be handed off (and not floating in the contaminated flood water) to be deemed safe, consumable news stories can’t be floating around in a mess of unreliable data. Plus, I need to have some touchstone sources to go to for information. I won’t take information (or water!) from any old street peddler.
Paying for individual stories is absolutely essential. The price has to be in pennies, though. One song on iTunes is $1.29, and that’s hitting the upper limit of what people will pay. Print paper prices were also traditionally kept low: 50 cents to $1.00 per issue, perhaps more for Sunday editions.
Newspapers must set up user accounts with credits
I can pay NYTimes, say, $50 and use that credit cache to pay for stories until the balance is zero.
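To make the mechanics concrete, here’s a minimal sketch (in Python) of how such a credit cache might work, assuming whole-cent article prices. Every name in it (CreditAccount, read_article, the article ID) is my own hypothetical illustration, not any real paper’s system:

```python
# A minimal sketch of a pay-per-article credit cache, assuming
# whole-cent prices. All names here are hypothetical -- no
# newspaper actually exposes this API.

class InsufficientCredits(Exception):
    """Raised when the reader's balance can't cover an article."""
    pass


class CreditAccount:
    """A reader's prepaid balance, debited a few pennies per story."""

    def __init__(self, balance_cents):
        self.balance_cents = balance_cents
        self.unlocked = set()  # articles this reader has already bought

    def read_article(self, article_id, price_cents):
        if article_id in self.unlocked:
            return  # already purchased; re-reading costs nothing
        if price_cents > self.balance_cents:
            raise InsufficientCredits("Credit cache empty -- top it up.")
        self.balance_cents -= price_cents
        self.unlocked.add(article_id)


# Pay the paper $50 up front, then spend pennies per story.
account = CreditAccount(balance_cents=5000)
account.read_article("2016-08-pay-for-journalism", price_cents=25)
print(account.balance_cents)  # 4975 -- $49.75 left in the cache
```

Keeping everything in whole cents keeps the micropayment math exact, and tracking already-purchased articles means a reader never pays twice for the same story.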
Journalists have to adapt. No, they don’t have to tweet. But the journalism world in general has gotten off track. Some sort of populist, reductionist train has carted off all the integrity of the field. A journalist’s job isn’t only to “report facts” – it is to sort out, for the reader, which side of an issue is more solid than the other.
Journalists must go back to thinking for themselves
The biggest conceit of journalism over the last 100 years is that its practitioners were “unbiased.” There is no such thing for humans. A journalist has been, and always will be, a curator of facts. This is what we want them to be. It isn’t “editorializing” if, as a trained professional, a journalist presents the most compelling arguments from all sides of an issue. This isn’t what has been happening in journalism. Instead, journalists have been beaten down into mere mouthpieces, forced by editors to write so-as-not-to-get-sued.
Consumers have been ignoring journalists as a whole for decades now. Instead, readers seek out journalists and writers by name. In the journalism world, these names usually belong to what are called “columnists,” writers with a regularly-produced column that concentrates on a certain area of expertise, e.g., technology (David Pogue), home life/essays (the late Erma Bombeck), world politics, entertainment, chess playing, etc. Just as people would seek out the comic strips they liked and ignore the rest, readers will choose to search out the opinion of a columnist they trust, and they will pay to do this. But only this. They will give up on the columnist if you make them pay for other columnists they despise.
Corporations must pay for news they quote
As John Oliver points out, every media outlet quotes newspaper journalism pieces. How is it that the Huffington Post, an entity that rakes in millions from advertising and from selling its user data to those advertisers, can take those journalism pieces and quote them for free? This is ridiculous. We will need legislation to make this work, but entities like BuzzFeed and HuffPo must pay subscriptions or individual-use rights fees to newspapers.
Newspapers must be non-profits
To make such legal measures work, all newspapers must function and be designated as non-profits. The argument for non-profit status of traditional news organizations is long and somewhat complicated; I won’t lay it all out here. But I think it is an essential piece of this whole crisis and a key element of a healthy democracy. All of this “being owned by Comcast” or “a Disney entity” crap is a dystopian megacorp capitalist hell, and we should get journalists out from under that oppression ASAP.
The news industry can survive this. In fact, nothing I’ve said here is a new idea. I personally have been in meetings and chats with industry insiders where these measures were discussed more than five years ago. The corporate hold (and mindset) has taken over newspapers, and they are slow (understatement) to make changes, because corporations like their mouthpieces.
We all need to pitch in to put this ship right.
Ask the NYTimes, Mother Jones, Washington Post, et al, to start up a pay-per-article credit system. Buy some credits & use them.
Tweet out, FB post, snap your favorite columnists and tell us why you like their perspective.
Hold Huffpo and other outlets to the fire if their articles contain no original work. Up your standards for what you consume, and hold those organizations to a higher standard.
Keep an eye out for legislation around these issues and keep an open mind when some is introduced. A free press is a key factor in the health of our democracy, and it’s important you don’t hate on any new legislative efforts to preserve it. Take your time to look through it first before you spew, please. Then, once you understand it, spew responsibly.