Showing posts with label thoughts from the shower.

Saturday, March 05, 2016

We can't assume we'll have fewer expenses in retirement

Conventional retirement planning wisdom is that you'll have fewer expenses in retirement, citing the absence of expenses such as parenting, commuting, office clothes, etc.

I don't think this is a safe assumption because of telecommunications trends in recent decades.

For my entire lifetime, the trend has been towards new technologies that require a monthly subscription. Telephones developed touch tone and voicemail and call display.  TV moved from antenna to cable, with more and more channels and more complicated and expensive packages.  The internet became common in people's homes, requiring a monthly subscription. Cell phones became common, requiring another monthly fee, and then smartphones with significantly higher fees. There even seems to be a trend away from mp3s and towards music streaming services with monthly fees (which baffles me - that's like not listening to your albums any more and instead listening to radio exclusively, and paying for the privilege).

If you retired in 1995 at the age of 65, your budget may well not have included internet, cell or data. In 2015, you'd be 85.  There's a good chance you'd still be alive, there's a fairly decent chance you'd still have a reasonable amount of cognition and live independently, but internet, cellphones and data plans would all have become part of normal household telecommunications.  You may well not have even thought of these things in your budget when you retired in 1995, but you'd be increasingly deprived without them as you enter old age.

I see no reason to assume this trend will reverse, so when budgeting for the expenses of future decades, we have to assume additional, unforeseen telecommunications needs.

At this point, some people are thinking "surely we have enough telecommunications now - future telecommunications would be luxuries, not necessities."  (Some people also probably think the same thing about internet access, but I suspect they aren't reading my blog.)  But there are three things to keep in mind about this:

1. As new telecommunications technologies become more common, they increasingly become part of the baseline. Think about how many of your income tax forms you have to download from the internet now compared with 1995. (Was it even an option in 1995?) Think about how many things you can't do without an email address.  You need a touchtone phone to access almost any business.  Already we have some services that can only be accessed with an app on a smartphone -  you can't order an Uber using a computer, for example.  So if you deprive yourself of future technology, you're making it increasingly difficult for yourself to fully participate in society.

2.  What we think of as nifty online services become increasingly valuable as a person begins to decline.  Grocery Gateway would be a saviour for someone who isn't mobile enough to get to the grocery store themselves!  Imagine if a person who is beginning to develop dementia could say "Siri, where am I supposed to be?" and Siri would know the answer and give them directions?  Or perhaps even do so pre-emptively to keep them from getting lost in the first place?  Or, on a 20th-century level, think of how with Call Display you can tell your grandmother "Don't answer if you don't recognize the caller" to protect her from scammers, and voicemail will make sure she doesn't miss any important calls that aggressive use of Call Display might otherwise cost her.

3. Even if future telecommunications do end up being luxuries, your own retirement planning is not about some hypothetical senior citizen who isn't into technology or who you have unilaterally declared can do without luxuries. It's about you. Do you want to deprive yourself of self-driving cars or holodecks or playing games with your grandchildren via 4D Facetime or whatever the future holds?  Do you want budgetary considerations to put your 90-year-old self in the position of a person who today can't get to the grocery store themselves but also can't buy their groceries online?

Unfortunately, I have no idea how to anticipate what future telecommunications expenses will end up being.

Saturday, February 20, 2016

How to reboot Are You Being Served?

I was very surprised to hear that they're rebooting Are You Being Served? because that show is very much a product of its time and totally out of step with modern comedic sensibilities.

But then my shower gave me an idea of how this might be carried off.

Grace Brothers is a struggling department store, conveniently located in London's West End so as to create a situation where all its sales staff are struggling actors, working in the store as a day job until they get their big break.

The sales staff are established as modern, relatable people - savvy, witty, reasonably worldly, aware of irony, texting and snapchatting, dressed like regular Londoners. They're also very good at their jobs as clothing salespeople, able to serve as personal shoppers and do alterations and bra-fitting and such, but in this modern world there's simply less call for this sort of service.

Then store management hands down a new dictum: in an attempt to boost sales and draw people back into the store, they're going for nostalgia. There is now a dress code - suits for the men, brown jumpers for the ladies, and all kinds of finicky rules about who's allowed to wear what kind of hat and how many frills you're allowed to have on your blouse. Staff are ordered to address each other as Mr./Mrs./Ms. Surname, and strict scripts are introduced, such as "Mr. Humphries, are you free?" and "Are you being served, Madam?"

The staff think this is ridiculous, so, being actors, they decide to make it a game. They see their new dress code as costumes, and start getting some character acting practice in when dealing with customers and management.  They do their job and do it as well as possible under the circumstances, but they do so while playing over-the-top roles and having a standing wager to see who can utter the most double-entendres. It's an ongoing improv game, creating foolish, outdated characters to go with store management's foolish, outdated vision. Also, the fact that they're all actors creates an opportunity for song and dance numbers, as sometimes occurred in the original - someone has an audition piece, they're yes-anding the fuck out of something that happens on the floor, etc.

Even as over-the-top improv characters played ironically, it would still take quite a delicate bit of writing to have the original Are You Being Served? characters work in the 21st century.  I mean, Mr. Humphries' whole schtick is that he has stereotypically gay mannerisms, and that's supposed to be intrinsically funny in and of itself. No competent writer or performer would think of that as a viable comedic choice in the 21st century!

But that gives me the idea (which may or may not actually be a good idea) that perhaps the actors staffing Grace Brothers are not actually good actors.  (That's why they're working in a struggling department store!) And the broad characters of Are You Being Served? are a result of their imperfect acting/improv skills. For example, Miss Brahms is a creation of an American actress who thinks she's speaking with a posh English accent, but it actually comes out Cockney.  Mrs. Slocombe is an attractive middle-aged woman trying to play a young hipster character, but her bold hair colours and makeup are actually unflattering and make her look even older than she actually is. Mr. Humphries is the creation of a Michael Scott type with no sense of judgement or appropriateness, but the character goes over well with customers (who have no clue that he's meant to be a joke and simply think he's fabulous) so no one stops him.

Or maybe that's what the original Are You Being Served? was doing all along...

Monday, December 28, 2015

The real fantasy of Pemberley is the servants

I've fallen down a bit of a Pride and Prejudice rabbit hole lately, exploring fanfictions and historical background information.

While I do enjoy poking around in the Jane Austen universe from time to time, unlike many Pride and Prejudice fans I never found Mr. Darcy particularly dreamy.  He proves to be kind and honourable and madly in love with the protagonist, all of which certainly come in handy, but doesn't have the je ne sais quoi that it would take to make me fantasize my way into Elizabeth's place.

However, in my recent revisitation, I realized the actual fantasy of being mistress of Pemberley isn't having Mr. Darcy for a husband - it's having Mrs. Reynolds for a housekeeper.

Mrs. Reynolds has been serving as Pemberley's housekeeper since back when Mr. Darcy's parents were still alive, and has kept it running smoothly even after his mother died.  This means that Pemberley can run smoothly without a lady of the house, but also knows how to accommodate a lady of the house.  So the position of mistress of Pemberley can be as much of a sinecure as its incumbent wants, and as much of an apprenticeship as she wants.

Being the mistress of a well-run estate is pretty much the most prestigious role in life that someone in Elizabeth's position could reasonably dream of. So imagine you end up in the most prestigious role in life that you can reasonably dream of, and the support team is in place to ensure that you will succeed. Even if you do nothing, the endeavours under your responsibility will succeed and you will get credit for it.  If you want to actually do the work, they can train you up so you can do it independently, and if you want to introduce your own ideas, they know how to adapt to that.

On top of that, the presence of Georgiana means that, despite the absence of a "lady of the house" for many years, the estate is equipped for having a lady in the house.  They no doubt have a maid who knows how to do hair, an existing business relationship with a local dressmaker, a horse that is accustomed to a sidesaddle rider - all kinds of things that it would be convenient for Elizabeth to have in her home and less convenient to have to acquire from scratch.  What's more, Georgiana probably knows a little something about being the lady of the house at Pemberley, but, since she isn't out yet, she doesn't officially hold the role (and would never expect to hold it in the long term since she'll likely get married and be mistress of her own household), so she wouldn't feel usurped by Elizabeth.  Mrs. Annesley probably also knows a thing or two about being mistress of a household since she's an upper-class lady and a "Mrs." herself, and part of her role is likely to prepare Georgiana for her future. But, at the same time, Mrs. Annesley is an employee, so she is incentivized to help Elizabeth succeed as well.

Compare this to the situation of Jane and Bingley, who are going to buy an estate of their own and start the Bingley dynasty from scratch.  Neither of them has ever run a large estate before (except for Bingley's time leasing Netherfield, which doesn't entail running the whole thing.)  Probably neither of them has ever hired a whole staff of servants before, and the servants they do hire won't necessarily know how to run the place optimally since no one has run that estate before (or, at least, not for the Bingley family). Caroline Bingley is around and has been mistress of the Bingley household (formally, because she's out), but because of that (and what we know of her character) she's likely to feel usurped, so she's less likely to be a useful resource for Jane.

None of this is hideous hardship, of course, because they are rich, but Jane does have to put in effort and diligence to succeed in her role as Mrs. Bingley, whereas all Elizabeth has to do to succeed as Mrs. Darcy is nothing. As long as she doesn't insist on overruling her experienced household with subpar ideas, she will succeed gloriously.

Thursday, November 19, 2015

New Rules: Natural Consequences Edition VIII

I was trying to brainstorm this one a while back, but a simple, elegant solution came to me in the shower.

12.  If you lie to someone about their own thoughts, feelings, motives or experiences, you have to shut up for 24 hours. You are not allowed to talk in the presence of the person to whom you lied about themselves during this time. If the lie was communicated via mass media or some other indirect medium, you're not allowed to use the medium in question in a way that will enter their sphere of awareness for the next 24 hours.  So if you tweeted the lie, you can't tweet for 24 hours. If you mentioned it in a TV interview, you can't talk on TV for 24 hours.  (So if you're a politician campaigning, be careful when you say "Torontonians want X".)

For every subsequent offence, this 24-hour period is doubled (e.g. 48 hours for the second offence, 96 hours for the third offence, etc.)

The person to whom you lied about themselves has the discretion to permit you to respond to a direct query on a case-by-case basis, but if you lie to them during this time it counts as a subsequent offence, and the punishment for that subsequent offence is doubled.  Sentences are served consecutively. (e.g. If, during the 24-hour period following your first lie, they give you permission to respond to a direct query and you lie to them about themselves in your response, you have to serve another 96 hours after the first 24 hours expires.)
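Since consecutive doubling sentences add up in a way that's easy to lose track of, here's a minimal sketch of the arithmetic (my own illustration in Python, assuming the rule exactly as described above, including the extra doubling for lying while a previous sentence is still being served):

# Sketch of Rule 12's penalty arithmetic (illustrative only).
# The nth lie earns 24 * 2**(n - 1) hours of enforced silence; lying while a
# previous sentence is still running doubles that again, and sentences are
# served consecutively.

def gag_hours(offence_number: int, during_active_sentence: bool = False) -> int:
    """Hours of silence earned by the nth offence (1-indexed)."""
    base = 24 * 2 ** (offence_number - 1)
    return base * 2 if during_active_sentence else base

# The worked example above: a first lie (24 hours), then a second lie told
# while that first sentence is still running (48 hours, doubled to 96),
# served back to back.
first = gag_hours(1)                                # 24
second = gag_hours(2, during_active_sentence=True)  # 96
print(f"Total silence: {first + second} hours")     # Total silence: 120 hours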


13. Sometimes, people who say assholic things claim that they're the only ones brave enough to express that opinion, when in reality no one else is even thinking those assholic thoughts.

People who do this should be treated as though they're too cowardly to do every single thing that it has never occurred to them to do, with whatever social consequences not being brave carries in their circle.

Wednesday, November 18, 2015

Paternity and participation

Just a few of the many things that exist in the world:

1. People who think that a good sense of humour means not holding back when it occurs to you to make a joke, and that uttering every potentially-humorous thing that occurs to you, regardless of how worthy or advisable it is, is laudable.
2. People who think that being present in your children's lives is sufficient to constitute good parenting.
3. People who think it's disgraceful that Kids Today allegedly get trophies for participation.

I've noticed that Category 1 seems to correlate with fatherhood, to the extent that really pathetic jokes that aren't even worth the breath it takes to utter them are called "dad jokes".

I've noticed that Category 2 seems to correlate with fatherhood, to the extent that people think the character of Cliff Huxtable is an exemplar of fatherhood solely on the grounds that he's seen on screen interacting with his children.

And, in my own experience, the majority of people (or, at least, the loudest segment) in Category 3 are men. I don't know how many of them are fathers, but most fathers are men.

So I find myself wondering how many people fall into all three categories, wanting kudos for mere participation in humour and/or fatherhood, but complaining when the same thing is offered to their children.

Wednesday, November 11, 2015

The Toronto Star ipad app problem

The Toronto Star recently came out with an ipad app, and they seem to be pushing it pretty hard, perhaps even prioritizing it over everything else.

The problem is that this renders some content inaccessible to people who don't have ipads.

If an ipad user tweets an article from the Star, the tweet links to the ipad version.  If you're reading on a computer, it doesn't autodetect that and direct you to the web version, or provide a link at the bottom to the full version like many mobile websites do. The ipad link doesn't always provide the full text of the article, and (so far, at least) when I've searched the Star website for the headline or the lede, it hasn't turned up anything.

There have even been one or two times when an article is teased in the print version of the newspaper, and they tell you to go to the ipad version for the full story!

So it seems that there are Toronto Star articles that can't be read in the print newspaper, on a computer, on a non-ipad tablet, or on a non-ipad i-device. They can only be read on an ipad.

Which is not a negligible inconvenience for people who can't afford an ipad or have no other need for one!

An ipad costs several hundred dollars. (Currently, the prices in the Apple Store range from $329 to $1429.)  My experience with other Apple products has been that I can only get a few years of use out of them, and I see no indication that this would be any different for ipads.

So the Star is creating a situation where, to get access to all the journalism in your local daily, you need to pay at least $100 a year - the cheapest ipad amortized over the few years of use it's likely to give - to another, unaffiliated corporation for a device that you may well have no other need for.

Do the owners of the Toronto Star own Apple stock?

I'm also wondering how this will affect googleability and archivability. Since I can't seem to get at them via web, it seems they aren't googleable. Can you access old articles in the app, or does it give you solely today's content? (I have no idea, because I don't have an ipad.)  Do they turn up in library periodical indices so they'll be available to people doing historical research in the future?  If the Star writes an article about your kid's awesome science project or gives your play a glowing review, is there a way to save the article for posterity? Even after the ipad is obsolete?

It's time for a more realistic First World War narrative

A while back I read an article (which I'm kicking myself for not bookmarking!) postulating that the people of Great Britain were so psychologically traumatized, individually and collectively, by the waste and horror and pointlessness of the First World War, that society collectively imposed a meaningful narrative upon it. They just couldn't cope with the idea that all this waste had been for nothing, so over the "Never Again" message intended by the creators of Remembrance Day, they superimposed glamorous sepia-toned Dashing Young Heroes, Fighting For Our Freedom.

That explains so much!

But, while I do thoroughly empathize with the need to control your narrative to get through the day, it's getting to be time to retire that narrative.  The last surviving WWI veteran died in 2012, at the age of 110. If there are any WWI survivors left in the world, they're pushing the century mark and, because they were so young at the time, may not even remember the war.  We're either approaching or have already passed the point where there's no one left whose psychological trauma needs to be attended to with this more-meaningful narrative.

The 100th anniversary of the end of WWI is coming up in a few short years.  Think pieces will be written. All we have to do is not include the sepia-toned heroism in the think pieces.  Talk instead about waste and tragedy. Talk about how it didn't even need to be a war, even by the standards by which things sometimes need to be a war.  Talk about how the ignorance of eager young recruits and the short-sightedness of governments led them to charge in, expecting a Jolly Good Adventure, with no idea what they were getting into. Talk about how this all destroyed individuals and families and communities and societies and physically broke Europe and created the conditions that gave rise to Nazism.  Maybe even talk some more about how people at the time had to impose a narrative of meaning and purpose to cope with their psychological trauma.

The unfortunate side effect of this imposition of a narrative of meaning on WWI has been that the waste, horror and pointlessness are not as much at the forefront of subsequent generations' minds as they should be. This creates a situation where subsequent generations are just as ignorant as the WWI-era recruits and governments who charged in expecting a Jolly Good Adventure and ended up in hell. And this ignorance may well affect decisions about whether to get involved in future warfare - thinking that WWI had purpose affects our mental ratio of "purpose vs. pointlessness of war", so we might be more likely to see purpose (or assume there must be purpose even though we can't see it) in a potential future war.

By restoring a more accurate narrative of pointlessness and waste, we'll reduce the chance of making the same mistakes in the future, which is the best way to honour all those who were killed or destroyed in or by the First World War.

Saturday, November 07, 2015

What if one day they'll accommodate intellectual disabilities like they do physical disabilities?

One of the reasons why I'm so obsessed with pensions is that, if my grandmother's trajectory is any indicator, I'm looking at nearly 20 years between when dementia makes it impossible for me to work and when I finally die.

In the shower this morning, it occurred to me that, with the aging population and declining economic security and employment quality, I'm not going to be the only person in this situation.  And, since the baby boomer generation will pass through this before I do, I'm not even going to be one of the first people in this situation.

What if all these factors align to create a society where dementia (and other intellectual disabilities) are seen as disabilities to be accommodated in the workplace, and the mechanisms for accommodation become common knowledge?

It sounds impossible now, but (at least to those of us who aren't up on such things) most disabilities sound impossible to accommodate until someone figures out how and people get used to seeing it in action.  (How many of us would have thought of a seizure response dog, or running blades?)

With today's technology, I could continue to work as a translator if I lost my eyesight or if I lost my hands. And society is moving on a trajectory from less accommodation to more accommodation. Maybe one day they'll figure out a way to let me continue to work if I lost my mind.

Thursday, November 05, 2015

Things They Should Invent: objective quality and maintenance standards for official residences

With the change of government and arrival of a new Prime Minister, 24 Sussex Drive has been in the news again.  Apparently it's in very poor condition and in need of extensive repairs, renovations and upgrades, but successive Prime Ministers have been reluctant to have the work done because they don't want to be seen spending public money on their residence.

A solution would be to set objective standards, both for the quality level that needs to be maintained and for the amount that needs to be invested in upgrading and renovating the building.  These standards would be set by people who are experts in building maintenance and heritage preservation, without any involvement by political leaders, so the Prime Minister (or, whenever possible, the National Capital Commission) is just following the rules.

As a starting point, here's a basic framework my shower gave me:

1. Baseline state-of-good-repair standard: This is your basic health, safety, functionality, and "this is the 21st century" standard. If the building doesn't meet this standard, it is to be immediately brought up to standard regardless of the price.  For example, the building needs to be free of asbestos and other poisons, have no leaks or infestations, be warmer than 20 degrees in the winter and cooler than 25 degrees in the summer, etc.  It could be based on or inspired by similar existing standards for rental housing, public buildings, etc.  The decision to carry out these repairs is made without the involvement of the Prime Minister or their family, similar to how tenants often get notices from their landlords saying "We will be turning off the water for three hours on Tuesday to repair a leak." The baseline state-of-good-repair standard is reviewed and updated at a fixed interval, by non-political people who are qualified to make this kind of decision, to make sure it still reflects modern baseline expectations for housing and public buildings.

2. New resident refurbishment allowance: Every time a new Prime Minister moves in, they are permitted to spend a certain legislated amount of money adapting the house to their family's needs.  One option is that they're allowed to spend up to a certain limit on changes from a list approved by the National Capital Commission.  An option with less political fall-out (inspired by employers who give employees on business trips a per diem rather than having them file expense receipts) is to simply hand over the allowance, have the National Capital Commission provide a list of what changes are and aren't permitted, and the Prime Minister's family can do whatever they need to.  It might actually be more efficient that way by saving on red tape justifying why they need to paint this room yellow or put heavier curtains in that room.  The amount of the new resident refurbishment allowance is reviewed and updated at a fixed interval, by non-political people who are qualified to make this kind of decision, to make sure it still reflects the needs of a family moving into a new home.

3. Regularly scheduled renovation/upgrade fund: A set amount of money is available at a set interval for whatever renovations/upgrades the building needs most, beyond state of good repair.  The renovations/upgrades are decided jointly by the National Capital Commission and a representative of the current Prime Minister's household. (The optics would be better if there's a housekeeper or someone like that who is very familiar with how well the building works and fulfills its functions but doesn't benefit personally from any upgrades, but if there isn't any such person any resident would do.)  The amount of this fund is reviewed and updated at a fixed interval, by non-political people who are qualified to make this kind of decision. Depending on the amount and the frequency with which it is used, it may be permissible to bank it for later use, or borrow from the next round, if a major expense should arise.  Not every Prime Minister's household is necessarily involved in using this fund - just whoever happens to be Prime Minister when the time to use the fund rolls around. For example, if the fund is only used in years ending in 3, then Jean Chrétien's household would have used it twice (in 1993 and in 2003), but Paul Martin's household never would have used it.

In addition to these amounts, the Prime Minister's family is permitted to spend their own money as long as the changes they make meet the approval of the National Capital Commission.

Because the quantities and frequencies of investment are legislated (or, at least, set out in some kind of official policy), it wouldn't be the Prime Minister's fault that the money is spent - the rules are just being followed.  (The rules could be written in such a way that it is the National Capital Commission that is required to spend the money, not the Prime Minister's household.)  And this heritage building would be kept in decent condition and be able to fulfill its official and ceremonial functions without being a source of national embarrassment.

This framework could also be used for other official residences: just replace "Prime Minister" with the dignitary who resides there and "National Capital Commission" with the organization responsible for managing the residence.

Tuesday, October 13, 2015

Things They Should Study: how does the gig economy affect productivity?

In economics, they often talk about productivity, most often bemoaning the fact that it isn't high enough.

I wonder if anyone has studied the effect of the gig economy on productivity? Because it seems like it would have a strong negative impact.

For example, a freelance translator has to not only translate, but also handle marketing, advertising, billing, online presence, inquiries from prospective clients, and all the administrative aspects of running a business of which I'm unaware. In comparison, a staff translator spends nearly all their time translating, and their employer's administrative staff deal with most of the rest of that stuff.  So it's easier for the staff translator to be more productive.

I'd imagine the same would hold in most occupations.  And, on top of that, the shorter the gig is, the less productive it is.  If industry standard is six-month contracts and then they transition to three-month contracts, workers have to spend time looking for work (rather than doing work) twice as often, and employers have to spend time hiring twice as often.  More and more person-hours are being spent on the non-productive tasks associated with connecting people with work rather than simply spending the time on work.
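To make the intuition concrete, here's a rough back-of-the-envelope sketch (my own invented numbers, purely illustrative) of how the share of time spent on actual work might shrink as contracts get shorter, assuming a fixed chunk of overhead - the worker's job search plus the employer's hiring - per contract:

# Illustrative sketch: fraction of total time that is productive work,
# assuming each contract carries a fixed number of weeks of job-search and
# hiring overhead. The overhead figure is invented for illustration.

def productive_share(contract_months: float, overhead_weeks: float = 3.0) -> float:
    """Fraction of time spent working rather than finding/filling the next gig."""
    contract_weeks = contract_months * 52 / 12
    return contract_weeks / (contract_weeks + overhead_weeks)

for months in (12, 6, 3, 1):
    print(f"{months:>2}-month contracts: {productive_share(months):.0%} productive")
# 12-month contracts: 95% productive
#  6-month contracts: 90% productive
#  3-month contracts: 81% productive
#  1-month contracts: 59% productive

The real overhead obviously isn't a fixed three weeks, but the direction of the effect is the point: halving the contract length roughly doubles how often that overhead has to be paid.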

I wonder if anyone has yet studied this enough to quantify it?

Thursday, September 17, 2015

The first jokes

Q: Why did the chicken cross the road?
A: To get to the other side!

That was the first joke I ever learned, when I was maybe 3 or 4 years old.  Before I learned that joke, I had never even heard of the concept of a joke. While googling to learn about its age and origin, I was surprised to discover that it is in fact an anti-joke. Expectations at the time were that the answer would be a humorous punchline, not a simple, practical statement of cause and effect.

But, just like "to get to the other side" was once novel and revolutionary, the basic riddle/joke format of asking a person a question to which you expect an answer so you can give a humorous answer instead was also once new and revolutionary.  Someone, somewhere in history, was the very first person to do it.  And someone was the very first person to think of it!  They not only wrote the joke (and it must have been a good joke for the format to persist), they also thought of the whole format.

I wonder how that very first joke went?  Since it was unprecedented, the person being told the joke probably gave a serious answer to the question, having no way of knowing that anything else might be expected.  Did they get the joke?  Did they think it was funny?  They might not have since it was so unprecedented (and they might feel a bit perplexed or made a fool of because they thought they were being asked for actual information and acted accordingly), but enough people thought it was funny that it stuck.

Q: Knock knock!
A: Who's there?
Q: Boo!
A: Boo who?
Q: Don't cry, it's only a joke!

That was the first knock-knock joke I remember ever being told, also when I was around 3 or 4 years old (probably the same day I learned about the chicken - I have fuzzy memories of a revelatory day when I learned all about jokes), and I didn't get it because I didn't know that "boo hoo" was meant to be onomatopoeia for crying.

Knock-knock jokes also require precedent to function, since they require audience participation. I don't remember being explicitly taught the script, but I must have either been taught it or seen it repeated on TV.

But someone thought of the knock-knock joke, and taught someone else the script so it would function (or, like, wrote it into a play or something).  And, somehow, it stuck!


Someone made the first pun.  Someone was the first to use sarcasm. (And, possibly, someone else was the first person to use it successfully.) Someone was the first to fake farting on the grounds that they thought it was funny.  Someone thought of and carried out the first practical joke.  (It may even have been one of our primate ancestors - I'm sure at some point a monkey has slipped on a banana peel!)  And all of these stuck, and got perpetuated.

What's even more interesting is that an unknowable number of other types of jokes that we've never heard of must have been thought of and attempted throughout human history, but they didn't stick because they weren't funny enough.

And this is still going on!  It's quite possible that right this minute, somewhere in the world, someone is thinking of and attempting an all new type of joke that no one has ever thought of before, only to fail utterly.

It's also quite possible that right this minute, somewhere in the world, someone is thinking of and attempting an all new type of joke that no one has ever thought of before, and it will succeed and spread!  Memes (in the sense of pictures with words on them that circulate on the internet) were developed within my adult life.  Those videos where people caption a Hitler movie to reflect current events started after YouTube was invented, so that's within the last 10 years (and quite possibly much more recently).  Someone might, this very minute, be inventing the next knock-knock joke, which, decades or centuries from now, will be retold by a preschooler who has just learned of the very concept of jokes.

Monday, September 07, 2015

The first beauticians

Someone, at some point in human history, was the first person to cut hair.  Maybe they didn't even cut it - maybe they just broke it off by hand, and later had the idea of applying blades or sharp stones or whatever.

And then, someone was the first person to cut hair for aesthetic reasons (rather than just because it got in the way or "Hey, let's see what happens!").  And someone was the first person to figure out that if you cut it a certain way it will fall a certain way. Someone invented bangs.  Someone invented layers.

Someone invented braiding.  I don't know if they first did it with human hair or to make rope.  The idea of weaving strands together so they'll stay put in a single, cohesive whole had never before occurred to any human being, but someone not only thought of it, but also figured out how to do it.

Someone invented the idea of tying or clipping hair back.  It seems glaringly obvious, but someone must have been the first (even if it was just the first human being whose hair got in their way).  Someone figured out the idea of a hair tie. Someone figured out the idea of a hair clip. Someone figured out a bun, and someone figured out that if you stick a stick through a bun it will stay.

Someone invented shaving. They came up with the idea of scraping the sharp thing along the skin to remove all the hair rather than cutting the hair further from the skin or pulling it out at the root.  Someone came up with the paradigm-shifting idea that not having hair where hair naturally grows might be aesthetically superior to one's natural state.  Someone came up with the idea that if you apply stinky gunk to body hair and press a piece of cloth on it and pull the cloth off, the hair will come out.  Someone came up with the idea of inventing chemicals that would cause the hair to just fall out.  Someone thought of zapping the hair with electricity and with lasers.

Someone was the first to think of dyeing hair.  Actually, someone was the first to think of dyeing anything. Before that, it never occurred to anyone that you could change the colour of stuff!  Or maybe they stumbled upon it by accident - fell into a vat of blueberry soup or something.

Someone invented piercings.  "So what I'm going to do is stick a sharp thing through your flesh to make a hole. Then you can put shiny things in the hole. It will be pretty!"

For that matter, someone invented jewellery. Someone was the first person to think that wearing shiny things is pretty, and everyone agreed!

Someone invented tattoos.  Someone thought of the idea of drawing something on their body permanently, and someone figured out that if you stick ink in your skin with a needle it will do just that.  Or maybe they didn't intend it to be permanent and it was all an accident! (Although that wouldn't explain why they were sticking ink-covered needles in skin in the first place.)

Thursday, August 20, 2015

A better approach to ethical objection by doctors

I've blogged before about the mystery of doctors who choose to practise in a certain field of medicine even though they morally object to an integral part of that field of medicine.  Surely they should have seen it coming that they'd be called upon to do the thing to which they morally object (in the case that inspired that blog post, prescribing contraception when working in a walk-in clinic) and surely they should have chosen a different field of medicine if they objected to this.

But with the eventual legalization of physician-assisted dying (as they seem to be calling it now) in the news, I see a situation where the doctors literally didn't sign up for this.  It's quite possible for someone to have become a doctor without having seen it coming that they could be called upon to deliberately end a life. 

So in the shower, I thought of a simple guideline that balances physicians' ethics, patients' rights, the "they should have seen it coming" factor, and the "they couldn't have seen it coming" factor.

Doctors should be required to provide all procedures and services that were usual and customary in their field and their jurisdiction at the time when they begin practising.  However, doctors can be permitted to opt out of only those procedures or services introduced after they began practising. The time when they "began practising" can be defined as either the time when they began their medical training as a whole, when they began their training in that specialization, when they graduated, when they began (or completed) their internship or residency - whatever the medical profession considers the optimal point in time.

So if you became a general practitioner in 1951, you can opt out of prescribing birth control pills on moral grounds. If you became a general practitioner in 2015, you had fair warning that you'd be called upon to prescribe birth control pills, so if you'd find that prospect morally objectionable, you had plenty of time to plan your career in a different direction.

If you became a doctor in 2007, you can opt out of physician-assisted dying on moral grounds.  If you become a doctor in 2020, you'll have fair warning that you might be called upon to help people die in whatever specialties end up providing that service, so if you don't want to provide that service you can specialize in podiatry or obstetrics or something.
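In other words, the proposed opt-out test boils down to comparing two dates. A minimal sketch (my own illustration - the years for when each service became usual and customary are placeholders, not official figures):

# Sketch of the guideline above: a doctor may opt out of a service only if it
# became usual and customary in their field after they began practising.

def may_opt_out(began_practising: int, service_became_customary: int) -> bool:
    """True if the doctor may refuse the service on moral grounds."""
    return service_became_customary > began_practising

# Echoing the examples above (years are illustrative):
print(may_opt_out(1951, 1960))  # True  - the 1951 GP may opt out of the pill
print(may_opt_out(2015, 1960))  # False - the 2015 GP must prescribe it
print(may_opt_out(2007, 2016))  # True  - the 2007 doctor may opt out of assisted dying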

If a doctor changes specialization or changes jurisdictions, they're required to provide all the usual and customary procedures and services at the time of their transfer. The reasoning here is they have an opportunity to research what they're getting into and plan accordingly.

This will also make it easier for patients not to get stuck with doctors who won't provide the service they need.  Patients can simply look up the doctor in CPSO or their jurisdiction's equivalent, and see when they began practising.

This way, the proportion of doctors providing a potentially-controversial service or treatment will always increase and never decrease. The acceptance of services among doctors (and therefore their availability) should mirror the acceptance of services among society (and therefore demand).  After a transitional period, patients won't ever find themselves stuck with a doctor who is morally opposed to a usual and customary service or treatment in their field. But, at the same time, no doctor is required to provide any service or treatment that they didn't know they were getting into.

Wednesday, July 01, 2015

Why do young LCBO workers still card me?

One of the things we did in my Sociolinguistics class in university was analyze print advertisements.  While analyzing an ad for some kind of beauty product, the prof asked us who the target audience is.

"Women at the age where they are just starting to see fine lines on their face," said one of my classmates. 

"And what age is that?" asked the prof?

"Late 20s," said one of my older classmates.  The other older classmates and the prof all nodded and murmured assent.

I was rather surprised that the beginnings of wrinkles would turn up while you're still in your 20s, but, being only 19 years old myself at the time, I had no actual frame of reference.

My own fine lines began showing right on schedule,  at the age of 27.  And, since I became aware of them, I also began noticing the presence or absence of lines on other people's faces.  I must have seen people with fine lines before, probably including those of my classmates in that Sociolinguistics class who could attest expertly to when fine lines start making their appearance, but it was never a factor that I took specific note of when processing a face as a whole.

It occurs to me that this might be the answer to the mystery of why younger LCBO workers keep carding me when older workers stopped long ago!

If the younger LCBO workers are like my younger self, they might not notice my fine lines as evidence that I'm no teenager.  But they'd be more likely to notice my acne since they've most likely been through acne themselves.

Similarly, people who haven't started greying yet might not notice my few individual grey hairs (I didn't notice random strands of grey on people who were anything less than salt-and-pepper before I started greying myself), but the fact that my hair is long, which is culturally marked as youthful, is readily apparent to anyone of any age.

Friday, June 12, 2015

Things They Should Invent: train PSWs in feminine facial hair removal techniques

A while back, I came up with the idea that nursing homes should provide free esthetics services so female patients don't have to deal with the indignity of facial hair.

Today, my shower gave me a far simpler idea: PSWs should be trained in hair removal methods that are appropriate for women's facial hair.


By general societal standards, removing facial hair is seen as more optional for men than for women. PSWs are trained in the more-optional removal of men's facial hair, so they should also be trained in the more-mandatory removal of women's facial hair.

As we know from our own firsthand experiences, tweezing out your yucky chin hairs is more of an everyday personal grooming thing that you do in your own bathroom rather than a specialized beauty treatment for which you go to a beautician.  Therefore, it should be treated as such and be part of the patient's everyday personal care done by their PSWs.  (Yes, beauticians do provide more hardcore facial hair removal services.  Barbers will also shave clients if asked, but male patients get shaved by PSWs rather than having to pay to go down to the hairdresser.)

Some will argue that PSWs are already trained in shaving and that's a hair removal method.  But it's not a correct, appropriate, suitable method for women's facial hair. Shaving results in same-day regrowth and stubble (especially on hairier-than-average people - and any woman with facial hair is hairier than average), which means that the socially-inappropriate facial hair problem will return before the end of the day.  Removing the hair at the root means the removal will last several days and the hair will grow back more gently and less visibly, allowing the patient to retain her dignity for longer.

And that's what this really is - a question of dignity.  Tweezing or threading or otherwise removing the hair at the root spares female patients the indignity of facial hair and the indignity of suffering through the masculine-marked process of having their face shaved. PSWs are trained to retain as much of patients' dignity as possible when bathing them, dressing them, toileting them, feeding them, moving them - every single area of daily life.  This should include the removal of unsightly facial hair.

Tuesday, June 02, 2015

Could working-class women dress themselves when upper-class women couldn't?

At certain points in Western history, aristocratic women didn't dress themselves.  They had their maids help them.  Based on what I've absorbed from the ether, they weren't necessarily able to dress themselves either, because of the design and complexity of the clothes.

For example, there's a scene in Downton where Lady Mary is going away for a weekend tryst, and she and Anna are looking through her wardrobe making sure that everything she packs is something she can put on all by herself (implying that she can't dress herself in all her clothes independently).  And this is in the 1920s when clothes were easier - in the Edwardian and Victorian eras, with corsets and crinolines and everything, it would be even more difficult to dress oneself.

I also recently read a book that mentioned that Edwardian upper-class ladies would wear tea gowns in the afternoons because that's when they met with their lovers, and tea gowns were something that a lady could put back on herself (implying that she's not able to put on her other styles of dresses herself).

This makes me wonder about the situation for working-class women.  Even if their dresses are more practical, the maids on Downton still have corsets and petticoats before the 1920s.  (In fact, there was a brief period where the aristocrats were wearing the newer, more comfortable uncorseted dresses, but the maids - who had to do actual physical labour - were still in the old corseted dresses!)  Could they dress themselves, or did they have to help each other dress?  What about Daisy, who woke up before anyone else in the house?  What about Mrs. Hughes and Mrs. Patmore?  Did one of their subordinates see them in their underthings every morning?  What if a working-class woman lived alone?  If a household consisted of just husband and wife, did he have to learn how to do up a corset?

Sunday, May 31, 2015

Teach me about the connotations of Orange County, California in the 1980s

When I was in elementary school (between 1985 and 1991),  this story-teller sort of guy came to our school and told us some stories.  When it came time to tell us the last story, he said we could choose between two: one was about a boy and his pond, and the other was about a big-city thief.  His tone and delivery suggested that the boy and his pond story was idyllic (and, by extension, boring) and the big-city thief story was exciting. My schoolmates overwhelmingly voted for the story about the thief, so he told us that story.

Afterwards, there was Q&A session, and someone asked him if anyone actually asked to hear the story of the boy and his pond, and he replied that it had happened once, in Orange County, California. His tone and delivery suggested that if you knew anything about Orange County, California, you'd understand why this was and perhaps find it humorous.

Of course, as an elementary school student in southern Ontario, I didn't know anything about Orange County, California.  In fact, I still don't.  This memory came back to me in the shower this morning so I've been doing some googling, and I still can't figure out any characteristics of Orange County that would make it clear why students there in the 1980s would prefer to hear a story about a boy and his pond. 

Anyone have any insight?

Saturday, May 16, 2015

The folly of condemning a boycott

There was recently a story tweeted into my feed about proposed "zero tolerance" for boycotting Israel.

This reminded me of something I've seen in US contexts: when there is a boycott of a business because of its business or labour practices, there are some commentators who say it's unethical to boycott the business in question.

This is ridiculous and unworkable.


I want to make it clear, I don't have a horse in this race.  To the best of my knowledge, none of the products I regularly buy or consider buying are from Israel.  All the cases I've heard of where people are talking about boycotts as though they're unethical have to do with US retailers that aren't available to my Canadian self.  I don't even have an opportunity to make these decisions, so I'm writing here solely as an external observer.  And as an external observer, I just don't see how boycotting could be unethical or something that you could have "zero tolerance" for, because of the very nature of a boycott.


What is a boycott?   It's choosing not to deal with a person or organization because you oppose some action or policy of theirs. (For syntactic simplicity, in this post I'm going to talk about boycott in terms of choosing not to buy from somewhere, but this can extend to all types of boycott.)


 So if boycotting is unethical or punishable, that would mean that, in order to behave ethically or to not be punished, you are required to buy from them.

And that's clearly unworkable.  The vast majority of people don't buy from the vast majority of sources the vast majority of the time.  Sometimes there's a better source, sometimes there's a more affordable source, sometimes there's a more readily available source, sometimes we simply don't need or want or can't afford the product in question.  If you're going to condemn people for not buying from somewhere, you'd have to condemn nearly everyone in the world.  (And on top of that there's the question of people who have bought from there but not recently. How do you tell if they've moved from buying to boycotting or if they just haven't needed to buy anything lately?)


At this point, some of you are thinking I'm oversimplifying things. After all, a boycott isn't simply not buying from somewhere, it's making a concerted choice not to buy because you oppose the source's policies and/or actions.

So let's follow this to its natural conclusion. If the anti-boycott people are okay with consumers simply happening to not buy certain products or services as a result of the natural course of their lives, but are opposed to us making the deliberate, mindful decision not to buy from certain sources to disincentivize them from behaviour we believe to be harmful, that would mean that the moral/legal imperative to buy from the source is triggered by the source's harmful behaviour.  If the source behaved in a way we considered appropriate, we wouldn't want to boycott them and therefore wouldn't be obligated to buy from them.  But as soon as they engage in behaviour we find unacceptable, we're obligated to buy from them in order to avoid engaging in the allegedly immoral/punishable act of boycotting.

Which is, like, the exact opposite of how market forces are supposed to work.  (Noteworthy because, I've noticed, many of the people saying boycotts are unethical seem to value market forces otherwise.)

Sunday, May 03, 2015

Things They Should Study: does the societal move away from print newspapers affect how informed kids grow up to be?

I've blogged before about how a lot of my basic understanding of medical and political concepts comes from my lifelong habit of reading newspapers, and how my lifelong habit of reading newspapers comes from having them around the house when I was growing up.

This wasn't a result of parenting, it was a result of incidental proximity. My parents didn't try to get me to read newspapers as part of education or child-raising, they just had them sitting on the kitchen table for their own use.  I just started rummaging through them in search of comics, moved on to adjacent features like advice columns and lighter news, and by middle school I was reading the local daily every day.

I wonder how this will play out for future generations as more people move away from print newspapers?

Even if the kids' parents read newspapers electronically, that doesn't leave as much opportunity for casual discovery. If everyone in the household uses their own device, there's no opportunity whatsoever.  If they have shared devices the possibility exists, but it's still less likely.  When you finally get a turn with the ipad, you're going to use it for gaming or social media as you planned, not to go look at the boring news sites mom and dad look at.  And with the move away from the web towards apps, casual discovery becomes even less likely, because the news is tucked away in its own separate app.

Older kids will have the opportunity for casual discovery through social media, but I feel like that's not the same as the casual discovery you get from a newspaper. As I've blogged about before, I find that in print I read articles that it would never occur to me to click on online.  I also find that my social media serves as more of an echo chamber, reiterating and going into greater depth on my own opinions and interests.  Both of them have their function, but I feel like I'd be far more ignorant without the newspaper habit.

Of course, it's quite possible I feel this way because newspapers are my baseline.  It's very easy for me to see ways that non-newspaper people are poorly informed by their lack of newspapers, but it's possible that I'm poorly informed in ways I can't perceive by not being more app-centric or something.

That's why I think it would be interesting to study how (and whether) the absence of print newspapers in the house when kids are growing up - but with the presence of informed parents - affects how well-informed they are as adults.

Sunday, April 12, 2015

The first tourist

In the shower this morning, it occurred to me that some one person in human history must have been the very first tourist, by which I mean the first person to travel recreationally.

For all of human history, people have travelled to find food or to flee problems where they were living before or to trade or to warmonger or to find new unused or conquerable land or for a quest or for a religious pilgrimage.

But recreational travel wouldn't have been a thing for much of human history, because travel was difficult and most people were too preoccupied with surviving. Plus, because no one had ever done it before, it probably wouldn't have occurred to many people to do it.

And then, someone, somewhere, came up with the idea of "Hey, let's go over there for no particular purpose, just to look around!  It will be fun!"  No one in the history of the world had ever gone somewhere for no particular purpose before!  But this person did, and somehow the idea caught on.