Monday, September 07, 2015

The first beauticians

Someone, at some point in human history, was the first person to cut hair.  Maybe they didn't even cut it - maybe they just broke it off by hand, and later had the idea of applying blades or sharp stones or whatever.

And then, someone was the first person to cut hair for aesthetic reasons (rather than just because it got in the way or "Hey, let's see what happens!").  And someone was the first person to figure out that if you cut it a certain way it will fall a certain way. Someone invented bangs.  Someone invented layers.

Someone invented braiding.  I don't know if they first did it with human hair or to make rope.  The idea of weaving strands together so they'll stay put in a single, cohesive whole had never before occurred to any human being, but someone not only thought of it, but also figured out how to do it.

Someone invented the idea of tying or clipping hair back.  It seems glaringly obvious, but someone must have been the first (even if it was just the first human being who also had their hair get in the way.)  Someone figured out the idea of a hair tie. Someone figured out the idea of a hair clip. Someone figured out a bun, and someone figured out that if you stick a stick through a bun it will stay.

Someone invented shaving. They came up with the idea of scraping a sharp thing along the skin to remove all the hair rather than cutting the hair further from the skin or pulling it out at the root.  Someone came up with the paradigm-shifting idea that not having hair where hair naturally grows might be aesthetically superior to one's natural state.  Someone came up with the idea that if you apply stinky gunk to body hair and press a piece of cloth on it and pull the cloth off, the hair will come out.  Someone came up with the idea of inventing chemicals that would cause the hair to just fall out.  Someone thought of zapping the hair with electricity and with lasers.

Someone was the first to think of dyeing hair.  Actually, someone was the first to think of dyeing anything. Before that, it never occurred to anyone that you could change the colour of stuff!  Or maybe they stumbled upon it by accident - fell into a vat of blueberry soup or something.

Someone invented piercings.  "So what I'm going to do is stick a sharp thing through your flesh to make a hole. Then you can put shiny things in the hole. It will be pretty!"

For that matter, someone invented jewellery. Someone was the first person to think that wearing shiny things is pretty, and everyone agreed!

Someone invented tattoos.  Someone thought of the idea of drawing something on their body permanently, and someone figured out that if you stick ink in your skin with a needle it will do just that.  Or maybe they didn't intend it to be permanent and it was all an accident! (Although that wouldn't explain why they were sticking ink-covered needles in skin in the first place.)

Friday, September 04, 2015

When the blinking stops

On the front of my cable box is a digital clock, displaying the time in big green digits with the colon between the hours and the minutes blinking once per second.

Every once in a while, I happen to glance at the clock and the colon appears not to be blinking.

I find this hypnotic. 

My eyes are drawn to it as if by a magnet. I cannot blink, cannot move, cannot look away. I stare and stare, mesmerized, until my eyes are on the verge of watering.  Then, just as I can no longer fight off the urge to blink, the clock starts blinking again.


I've always said that one thing I'd wish for if I had a genie was a remote control that can control the passage of time.  I'm sure we've all had that kind of day where we'd love to press the pause button and take a nap.

Maybe these moments when the clock appears to stop blinking are a clue that someone already has one...

Monday, August 31, 2015

Books read in August 2015

New:

1. Thrown by Kerry Howley
2. Architecture in the Family Way: Doctors, Houses, and Women, 1870-1900 by Annmarie Adams
3. Rides (French translation, by Carole Ratcliff, of Arrugas by Paco Roca.  Weirdly, my library didn't have the original Spanish or the English translation, but it did have the French translation.)
4. The Paying Guests by Sarah Waters
5. Studio Grace: The Making of a Record by Eric Siblin

Reread:

1. Vengeance in Death
2. Holiday in Death
3. Midnight in Death

Sunday, August 30, 2015

Ivory

From The Ethicist:
When my mother passed away, I inherited an antique necklace made of carved ivory beads. I love the look of — and am sentimentally tied to — this necklace, but I am also a supporter of anti-poaching programs and organizations. I have avoided wearing the necklace because I don’t want to appear to support the ivory trade. On the other hand, I hate not being able to wear one of the few pieces of jewelry that I have from my mother. What should I do with the necklace?
One thing that occurred to me while reading this: would people actually recognize it as ivory?

When I do a google image search for ivory necklaces, they look like plastic costume jewellery to me. I have no idea if they'd look non-plastic in person, but based on the image search I seriously doubt that I'd look at them and automatically think "Clearly, that must be made from dead elephant tusks!"

I have a few pieces of jewellery from my late grandmother, and one of the necklaces has a few white beads on it.  The only reason why I know for certain they aren't ivory is because my grandmother wasn't anywhere near wealthy enough to be able to afford ivory, even as small beads in a necklace made of many other things, even if it were a special, one-time luxury.

One of the lines of discussion in the column is whether wearing ivory jewellery promotes the notion of ivory as a glamorous luxury item that is beautiful and should be coveted. But I question whether anyone who isn't enough of an expert to already have their own well-established opinion on the matter would even recognize it as ivory.

And, if LW is asked about the composition or origin of the necklace, she could simply and truthfully respond by talking about how it was her mother's and has great sentimental value.

Friday, August 28, 2015

How working from home affects my subconscious

One side-effect of working from home is that my subconscious seems to be less active.  I don't notice the lack of subconscious activity itself, but when I have a (now-unusual) high-interaction day, I notice that I'm predreaming a lot more as I wait to fall asleep.  And the content of the predreaming is most often directly related to the interaction of the day - I can hear the voices and cadences of the people I interacted with echoing in the background, like you would if you were nodding off in a crowded room.

I don't specifically remember the influence of the people I interacted with in my subconscious before I started working from home, but it's quite possible I didn't notice it because it was baseline. 

If asked to think about my dreams or predreams in isolation, I would never say that I feel they're not what they should be on a regular work-at-home day. But, nevertheless, they are far more vibrant on high-interaction days, with content directly related to the interactions of the day.

Teach me about the economics of ATMs

A tiny family-owned convenience store in my neighbourhood has a Royal Bank ATM in it. Since the bank account from which I most often withdraw cash is with Royal Bank, I sometimes pop into this store to use the ATM.

However, I never buy anything from this store, because it doesn't have anything I need that can't be obtained at a significantly better price elsewhere in the immediate neighbourhood.

Am I making this store money by using the Royal Bank ATM located inside it? Or am I costing them money?

My googling tells me that no-name ATMs - the ones that you often find in bars and restaurants and that charge exorbitant fees - make money for the business in which they are located.  But do bank-branded ATMs also do that?  Even though they don't charge a transaction fee if you're a customer of that bank?  Or do the banks charge businesses to host the ATMs on the grounds that the ATM might attract people who will then become paying customers?  (I haven't been able to google up anything suggesting that they do, but it sounds like the kind of thing a bank would come up with.  Nor have I been able to google up anything suggesting that businesses make money from hosting bank-branded ATMs.)

If it costs this little convenience store money for me to use their ATM, I'll get off my lazy ass and walk a block to the actual bank branch.  But if it's revenue neutral, I want to use it sometimes because it's more convenient for me on some of the routes that I take for various errands. And if it actually generates revenue for them, maybe I should use it systematically, rather than that revenue going to the bank or to Shoppers Drug Mart.

Anyone have any insight about how this works?

Sunday, August 23, 2015

The time I failed to do surgery on a Sleeptracker watch

I've blogged before about my various successful attempts to repair various malfunctioning household objects, so it's only fair if I write about my failed attempt.

I'd been using a Sleeptracker watch for years without incident, when one day the battery died.  The little hole-in-the-wall jewellery store where I'd previously gotten the battery replaced had closed and I happened to be at the Bay store at Yonge & Bloor, so I decided to see if their jewellery department did watch batteries. They did, and appeared to change the battery successfully.

Unfortunately, after a couple of days' use, I determined that the watch wasn't beeping any more.  It was telling time properly, but not beeping when the alarm was supposed to go off. Which makes it useless, since the whole point of a Sleeptracker is to wake you up!

I did some googling, and found some other people on the internet who had had the same problem with digital watches (although never Sleeptrackers specifically), including instructions that were supposed to fix the problem. (I can no longer google up the specific instructions I found.)  So I bought a tiny screwdriver and opened up the watch to follow the instructions.

I managed to open it up reasonably easily, followed the instructions, and replaced the battery, but it still didn't beep. And, to add insult to injury, I couldn't get the watch closed again.  The band kind of overlaps the piece on the back of the watch that needs to come off, and I just don't have the physical dexterity to get that piece back on and tucked under the band on both the top and bottom and get the screw-holes to line up so I can put the screws back in.

So I put all the parts in a ziploc bag, and ended up buying a new Sleeptracker watch on eBay. (They seem to have discontinued the watches and replaced them with an iphone app, which is useless to me given how often I throw everything in bed with me out of bed in my sleep!)

Now the battery of that new Sleeptracker is running low, and I'm worried about whether it will be a complete write-off too.

Friday, August 21, 2015

Why is defecting allowed?

With the news that some Cuban Pan Am athletes defected, I find myself wondering why defecting is allowed from the perspective of the receiving country.

The results of my googling talk about the fact that defecting is prohibited by the country of origin and measures that countries might take to prevent people from defecting away.  But they take for granted that the receiving country will be happy to welcome the defectors.

Does the receiving country always in fact welcome defectors?  If so, why?  Do they ever turn them away?  Or can people automatically get in by announcing that they're defecting?

The definition of "defecting" as opposed to "emigrating" is that the country of origin doesn't want to let you out.  So, given recent stinginess towards refugees in various parts of the world, maybe people who have to pay smugglers to get them across the border so they can claim refugee status should instead announce that they're defecting?

Thursday, August 20, 2015

A better approach to ethical objection by doctors

I've blogged before about the mystery of doctors who choose to practise in a certain field of medicine even though they morally object to an integral part of that field of medicine.  Surely they should have seen it coming that they'd be called upon to do the thing to which they morally object (in the case that inspired that blog post, prescribing contraception when working in a walk-in clinic) and surely they should have chosen a different field of medicine if they objected to this.

But with the eventual legalization of physician-assisted dying (as they seem to be calling it now) in the news, I see a situation where the doctors literally didn't sign up for this.  It's quite possible for someone to have become a doctor without having seen it coming that they could be called upon to deliberately end a life. 

So in the shower, I thought of a simple guideline that balances physicians' ethics, patients' rights, the "they should have seen it coming" factor, and the "they couldn't have seen it coming" factor.

Doctors should be required to provide all procedures and services that were usual and customary in their field and their jurisdiction at the time when they began practising.  However, doctors would be permitted to opt out of only those procedures or services introduced after they began practising. The time when they "began practising" can be defined as the time when they began their medical training as a whole, when they began their training in that specialization, when they graduated, or when they began (or completed) their internship or residency - whatever the medical profession considers the optimal point in time.

So if you became a general practitioner in 1951, you can opt out of prescribing birth control pills on moral grounds. If you became a general practitioner in 2015, you had fair warning that you'd be called upon to prescribe birth control pills, so if you'd find that prospect morally objectionable, you had plenty of time to plan your career in a different direction.

If you became a doctor in 2007, you can opt out of physician-assisted dying on moral grounds.  If you become a doctor in 2020, you'll have fair warning that you might be called upon to help people die in whatever specialties end up providing that service, so if you don't want to provide that service you can specialize in podiatry or obstetrics or something.

If a doctor changes specialization or changes jurisdictions, they're required to provide all the usual and customary procedures and services at the time of their transfer. The reasoning here is they have an opportunity to research what they're getting into and plan accordingly.
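
This rule is mechanical enough that you could code it. A minimal sketch in Python - dates hypothetical, and "began practising" standing in for whichever milestone the profession picks:

from dataclasses import dataclass

@dataclass
class Doctor:
    began_practising: int  # year, under whichever definition is chosen

def may_opt_out(doctor: Doctor, service_introduced: int) -> bool:
    # Opting out is permitted only for services introduced after
    # the doctor began practising.
    return service_introduced > doctor.began_practising

# The examples above, roughly (assuming the pill arrives in 1960
# and physician-assisted dying in 2016):
print(may_opt_out(Doctor(1951), 1960))  # True: may opt out
print(may_opt_out(Doctor(2015), 1960))  # False: had fair warning
print(may_opt_out(Doctor(2007), 2016))  # True: didn't sign up for this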

This will also make it easier for patients not to get stuck with doctors who won't provide the service they need.  Patients can simply look up the doctor in CPSO or their jurisdiction's equivalent, and see when they began practising.

This way, the proportion of doctors providing a potentially-controversial service or treatment will always increase and never decrease. The acceptance of services among doctors (and therefore their availability) should mirror the acceptance of services among society (and therefore demand).  After a transitional period, patients won't ever find themselves stuck with a doctor who is morally opposed to a usual and customary service or treatment in their field. But, at the same time, no doctor is required to provide any service or treatment that they didn't know they were getting into.

Tuesday, August 18, 2015

Excellent customer service from Rexall (but, unfortunately, subpar umbrellas)

I grabbed a black umbrella at Rexall because they were calling for rain the day we scattered my grandmother's ashes, and I thought my usual coloured umbrellas would be inappropriate for a cemetery. But when I went to open it at the cemetery, it refused to open and the handle just came off in my hand. Multiple people all tried to open it, and no one succeeded. Fortunately I was able to borrow an umbrella, or I would have gotten wet!

I took it back to Rexall (fortunately I still had the receipt and the label with the barcode, although the label was torn.)  The cashier tried to open it, had the same problem, and then promptly and cheerfully gave me a full refund, all while expressing concern that she hoped I hadn't gotten too wet.

Unfortunately, I've had problems with every umbrella I've ever bought from Rexall. They used to have these cheerful yellow ones that I adored, but they'd always break within just a couple of months - and not even from blowing inside out, just from opening and closing and being in my purse.  I've never once been satisfied with the quality and longevity of an umbrella I bought there.

However, I am very satisfied with their customer service, so I will send more of my non-umbrella business their way.


Sunday, August 16, 2015

Why is there acetaminophen in everything?

This summer cold has used up my stock of Nyquil, so I went to the store to buy more.  They've changed their product line since I last bought any, and now there are two kinds: Nyquil Cold & Flu and Nyquil Sinus.  I wasn't sure which one was "normal" Nyquil, so I read the labels, and was surprised to see that there's acetaminophen in both.

There's probably always been acetaminophen in Nyquil, but I noticed it this time because some random internet person recently told me that the freakish dreams I had when I tried Tylenol Cold & Flu a while back might have been due to the acetaminophen.  So I looked at the similar cold medicines on the shelf, and all of them had acetaminophen, except for the Advil-branded medicines which had ibuprofen. I could not find one single decongestant that doesn't contain acetaminophen or ibuprofen.

Why do they do this?

Acetaminophen is a painkiller and fever reducer.  The vast majority of my colds don't come with pain or a fever and, when I do have aches and pains or a fever caused by a virus, I don't always want to suppress them - I want to be able to accurately monitor the evolution of my condition.

I use decongestants so I can stop sniffling long enough to fall asleep.  I want my nose to stop running, and I wouldn't say no to a sedative. Fevers and aches and pains don't prevent me from sleeping (in fact, they make me want to lie down and close my eyes), so the acetaminophen simply doesn't help.


This cold also brought me a productive cough, so I decided to take an expectorant to make it pass faster.  Last time I had a cough, I learned that the very effective but very disgusting Buckley's cough syrup comes in pill form, so I bought some called Buckley's Complete Plus Mucous Relief.  I took this out of my medicine cabinet, read the label, and discovered that not only does it contain acetaminophen too, it also contains a cough suppressant in addition to the expectorant!  (And a decongestant, but I don't object to that.)  It seems that the cough suppressant and the expectorant would work at cross-purposes, and, since I'm at home, I don't want to suppress the cough! I want to spend the day coughing my lungs up and be done with it rather than having it stretch out over days and weeks.

So I went to the drug store and looked at the cough medicines and, once again, all of the cough medicines that come in pill form had acetaminophen and a cough suppressant. (There was a liquid expectorant, but liquid cough medicines are disgusting so I really hope to avoid them.)


I don't understand why they do this.  I very, very rarely need acetaminophen at the same time as I need a decongestant or cough medicine, and if I do have a fever or aches and pains that I do want to treat, it's no effort to take a Tylenol in addition to my cold or cough medicine.  There are also people for whom acetaminophen is contraindicated. What is gained by shutting them out of the cold medicine market?

On top of that, there have been concerns recently about people inadvertently taking risky levels of acetaminophen. Surely an easy and unobjectionable first step would be to remove acetaminophen from medications whose primary purpose is not pain management or fever management!

Tuesday, August 11, 2015

itunes deleted all the music off my ipod

The ipod: a 4th generation ipod touch running iOS 6.0
The itunes: itunes 11.0
The computer: PC running Windows 7


I added some new music to my itunes library, then synced my ipod (like I've done many, many times before) to add that music to my ipod.  But, to my shock, itunes instead deleted all the music off my ipod!

I tried to sync it again, but this didn't add any music to it. And, unfortunately, the second attempt to sync overwrote the backup of my ipod (itunes only keeps one backup and there's no way to keep others!), so I couldn't just restore the backup.

I went through all the usual disconnecting, reconnecting, turning stuff on and off, pressing both buttons to reset the ipod, but I still couldn't convince my music to go back on my ipod.  At one point about 400 songs (out of over 8000) went on the ipod, but when I tried to add more they went away.

Some parts of the internet suggested that this was a copyright thing, and music that wasn't purchased through the itunes store was being deleted, but that wasn't the case here.  I've never purchased any music through itunes, and the 400 songs that did end up on my ipod were from a mixture of CDs and downloads.

This discussion thread had some people experiencing the same problem. Some people suggested that this was a known issue and if you went to the Apple store they'd fix it, but they never specified what the Apple store did to fix it. I was slightly reluctant to do that, because the Apple store would probably update my iOS, and whenever my iOS is updated something goes wrong with one of my apps.  Also, since my computer is a PC, the Apple store won't even look at it, even if part of the problem is in the itunes that is on my computer.  So all I could see them doing was restoring my ipod, thereby forcing me to upgrade my iOS, and sending me on my merry way.

During the course of my research, I learned about the Manually Manage Music option in itunes, which allows you to drag and drop music onto your ipod rather than using the sync function. I tried to put my playlist on the ipod using this function, and it transferred about 500 songs (again with no discernible pattern), which is better than before but still only a small fraction of my 8000+ song playlist.

Someone on the internet who was having the same problem mentioned transferring their songs album by album, so I decided to try to transfer the songs from an album that didn't get transferred.  I typed the name of the album into the search box in itunes, selected all, and dragged them over to the ipod.  It sat on the first step of the process (something like "preparing for transfer" - I'm not about to try it again just to get the exact name of the step!) for a really long time, leading me to believe it wasn't going to work.  Resigned, I contemplated whether to update and restore the iOS, update or reinstall itunes, go to the Apple store, or any number of time-consuming and no doubt fruitless steps that happen next in the troubleshooting process.

At some point during this contemplation, I idly backspaced the album name that I'd typed in the search box of itunes, so itunes once again displayed all the songs.  And, at some point during this contemplation, itunes started transferring the remaining ~7500 songs to my ipod!!

I have no idea if backspacing the album name out of the search box is what caused all the songs to transfer.  I have no idea if it was caused by something else I failed to notice.  But I haven't been that happy since the day my computer finally came back from the Dell depot!

So now I'm keeping my ipod in Manually Manage Music mode, so I don't have to do a full sync the next time I want to put more music on it. If I have trouble again, I'll try transferring just a few songs isolated by searching in itunes, then backspace the search out of the search box.  (I have no idea if that will work, but that's where I am in my testing.) However, I'm dreading what will happen when I have to sync the ipod again to update apps or something.

Friday, August 07, 2015

The time I did surgery on a remote control

A few days ago, I turned on my TV, pressed the AV button on the remote to switch it to my Wii, and discovered that the AV button didn't work.  I pressed some other buttons, and they didn't work either.  I replaced the batteries, and that didn't fix the problem.

My cable remote could be convinced to control everything on the TV except switching it to AV, so I figured I'd have to either try a universal remote (might not work, since my TV is not a common brand) or maybe even buy a new TV.

Since the remote was dead anyway, I decided to see if I could take it apart to find out why it wasn't working. I saw some little screws, so I unscrewed them.  Then I pulled the casing apart, and could clearly see how the inside worked.  There was a rubber layer that constituted the buttons, with a small dot of what I assume is conductive material corresponding with the appropriate point on the conveniently-labelled circuit board.  I couldn't see any flaws or signs of wear and tear, so I sprayed compressed air at everything and put it back together, noting with interest that the various parts seemed to be deliberately shaped in a way that makes it impossible to put it back together incorrectly.

I screwed the screws back in, put the batteries back in...and it worked!!!

I have no idea why it worked - I didn't do anything to cause it to work - but nevertheless I took it apart, put it back together, and it worked.

That's one impossible thing before breakfast!

***

I've blogged before about positive physical changes that correlate with getting older.  I think I now have a positive mental change that correlates with getting older: better ability to take things apart and put them back together.

I've blogged about my experiences with a chair and a lamp.  In both cases, and in the case of the remote control, I don't think I could have done it in my 20s without clearly-illustrated written instructions.  My brain just didn't see how things worked the same way.

I have no idea why this is. I've never done anything to work on it deliberately.  There's nothing in my day-to-day life that should improve my ability to take things apart and figure out how they work.

Understand, I'm still not objectively good at taking things apart or figuring out how things work.  I'm still very much hindered by my clumsiness and poor physical skills of all types. But I do seem to be better at it than I was before, and I do seem to be improving, for reasons I cannot fathom.

Monday, August 03, 2015

I promise one day I'll blog about something other than pants

One more factoid from my exploration of young people's attitudes towards pants that I forgot to mention in my previous blog post on the subject:

It seems that young women who wear high-waisted pants don't consider the high waists to be suitable for when they're trying to dress like a grownup (job interviews, internships, entry-level office jobs, etc.).  Many discussions by young people of dressing for the adult world specifically mentioned high-waisted pants on the "what not to wear" list, and no one ever argued against that point.

This surprises me because when I was a teen, it never would have occurred to me that a more youthful silhouette would be inappropriate for dressing like an adult.  If my pants were dress pants and my blouse was a blouse, I considered that sufficiently grown-up, even if the pants were hip-hugging flares and the blouse was fitted.  I made sure the individual pieces met "grown-up clothes" criteria and I made sure nothing was exposed (i.e. there wasn't a gap of midriff showing between my hip-hugging flares and my fitted blouse), but it never would have crossed my mind to wear a higher waistline, a baggier blouse, or tuck and belt.

Similarly, if a student came into our office wearing high-waisted pants tucked and belted, it would never occur to me to think that this outfit was less grownup or less professional than what I'm wearing.  I'd recognize it as a current style, something I myself wouldn't wear, but there's nothing inherently wrong or Less Than about it. Even if it were unflattering on her, I wouldn't see that as less professional or less adult. I'd just see her as someone whose fashion skills haven't yet fully developed, which has no bearing on her professional competence.

Added to that, some of the older female employees in my office are from the era when shirts were last tucked in. Some of them evolved away from that as fashion trends changed, but I can think of one or two individuals whom I've seen in mom jeans.  For those of us who don't wear them, it's most often because we see the style as too old, not because we see it as too young.

It's so interesting how something so innocuous can have such different connotations in the eyes of people with not even a generation's age gap!

Friday, July 31, 2015

Books read in July 2015

New:

1. The Alzheimer's Diary by Joan Sutton
2. The Rosie Effect by Graeme Simsion
3. Happiness by Design by Paul Dolan
4. The End of Memory by Jay Ingram

Reread:

1. Calculated in Death
2. Ceremony in Death

Thursday, July 30, 2015

Things They Should Invent: sitting pants

I recently read an article about a designer who's making clothes for people in wheelchairs:
The mainstream clothing that we buy is cut and drafted for standing. It’s something we don’t think twice about. When we sit down our clothes get all messed up. What I mean by that is that with pants they cut you in your gut and they ride down at the back; or with a long coat, it will get all bunchy at the front.
Until I discovered Reitman's Comfort Fit, literally every single pair of pants I'd ever worn cut into my gut and rode down in the back when I sat down.  And, since I've spent the vast majority of my life in a classroom or at a computer, that meant my pants were uncomfortable the vast majority of the time.

I'm sure I'm not the only one who spends far more time sitting than standing, especially when at work or in school.  I don't know what sitting pants would look like on a person who's standing up, but some people may well be willing to make the sacrifice.  Added to that, there are situations in which people have to look good sitting down but standing up is less relevant (talk show guests come to mind, and I'm sure there are others).

I'll bet this business and others like it could expand their client base and probably earn higher margins by making sitting pants for non-disabled people who simply spend a lot of time sitting.

Sunday, July 26, 2015

Pants

I had a teacher in high school who always wore high-waisted, pleated pants, and I thought they made her look frumpy. This teacher was the kind of person who would otherwise have come across as youthful and with-it - she was under 40, savvy, up on her students' pop culture, able to discern who had a crush on whom, meticulous with her hair and makeup - but I thought these pants were so aging and out of touch.

It occurs to me that teenagers today probably think the exact same thing about me and my boot-cut pants.

I have noticed recently that when I see boot-cut, flared or wide-legged pants being worn in media from 10-15 years ago, it seems a bit out of place.  But, nevertheless, I feel badass in my own boot-cut pants, and frumpy in skinnies.  So I keep wearing what makes me feel badass, even if Kids Today might be laughing at it. Flares are scheduled to come back in a few years anyway.

***

I blogged previously about the recent trend among young women of wearing high-waisted pants with shirts tucked in.  I recently found out why they do this: they believe it's slimming because the well-defined waist emphasizes how small their natural waist is.  This flabbergasted me because, with my fashion awareness having developed just as the last shirt-tucking trend waned and shirt untucking (with narrow shirts) came into fashion, I think an untucked shirt is more slimming because it creates a smoother line, and a tucked shirt is less slimming because it creates a sausage effect.  In one of my journeys down an internet rabbit hole, I landed in a fashion forum populated by young women where people posted comparison pictures to prove that high waists and tucking and belting were more slimming, and I genuinely felt that these pictures demonstrated beyond any doubt that an untucked shirt was more slimming.  We're looking at the exact same thing and seeing complete opposites!

I'm not going to link to the examples I saw, because it isn't appropriate to send my adult readership to look at pictures of teens and scrutinize their figures with the general message of "See how these kids think they look slim but they really look lumpen!"  So, instead, I'm going to show you two pictures of actress Angie Dickinson from the 1950s:

Angie Dickinson (right) in a belted bodysuit
Angie Dickinson in a non-belted bodysuit

I think the outfit on the left is less flattering, specifically because of the belt. To my eyes, the belt creates a sausage effect with the soft part of her belly above and below, making her tummy below look sticky-outy, and the fleshy bit above look like a roll of fat.  Obviously this effect is very minimal on Ms. Dickinson - it's far more pronounced on a person with a more average figure - but you can see the hint of it here.  In contrast, I think the outfit on the right is more flattering because it creates a smoother line without any bulges of flesh.

However, people who choose high waists and tucked-in shirts see the picture on the left as more flattering, because the belt is cinched tightly around her waist, showing just how small her waist can be made to go. They'd see the picture on the right as less flattering, because it doesn't necessarily demonstrate the minimum possible circumference of her waist.

This isn't just an evolution of fashion trends, it's a complete change in what different people perceive when looking at the exact same thing!  It will be interesting to see how the fashion choices of the belt = thinner contingent evolve as trends change and, eventually, a high waist and tucked-in shirt once again become signs of frumpiness.  I've blogged before about differing generational perceptions of pants length. Maybe in a decade or two, we'll also have differing generational perceptions of waistlines.

Saturday, July 25, 2015

Idea density

I've been reading about the famous Nun Study of Alzheimer's disease, and specifically about its findings relating to idea density.

As part of the study, they analyzed essays that the nuns wrote when they were in their early 20s, and found that nuns who didn't get Alzheimer's had higher idea density in their essays, and nuns who did get Alzheimer's had lower idea density.

An example of a sentence with high idea density, taken from this article:

"After I finished the eighth grade in 1921 I desired to become an aspirant at Mankato but I myself did not have the courage to ask the permission of my parents so Sister Agreda did it in my stead and they readily gave their consent."

An example of a sentence with low idea density:

"After I left school, I worked in the post-office."

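As an aside, idea density is typically scored as propositions per ten words, with automated tools approximating "propositions" by part-of-speech counts. Here's a rough sketch of that heuristic in Python - my own approximation, not the study's actual method:

# Requires nltk, plus nltk.download("punkt") and
# nltk.download("averaged_perceptron_tagger") on first use.
import nltk

# The usual heuristic: verbs, adjectives, adverbs, prepositions and
# conjunctions each count as roughly one proposition (Penn Treebank tags).
PROPOSITION_TAGS = {
    "VB", "VBD", "VBG", "VBN", "VBP", "VBZ",  # verbs
    "JJ", "JJR", "JJS",                       # adjectives
    "RB", "RBR", "RBS",                       # adverbs
    "IN", "CC",                               # prepositions, conjunctions
}

def idea_density(sentence):
    words = nltk.word_tokenize(sentence)
    propositions = sum(1 for _, tag in nltk.pos_tag(words)
                       if tag in PROPOSITION_TAGS)
    return 10 * propositions / len(words)  # propositions per ten words

Run on the two example sentences above, this heuristic should score the Mankato sentence higher than the post-office one, in line with the study's high/low classification.
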
Interpretations of this finding tend to view high idea density as equivalent to better language skills. But when I read about this, my first thought was that some nuns may have been able to write a more idea-dense essay, but chose not to.  They may have thought a simpler style more appropriate to the purpose of this essay.  They may not have had much colour to add to this one particular subject.  Maybe they didn't have as much time as they would have liked.  Maybe the pen they were using was uncomfortable to write with. It's possible that their writing style may even have matured away from frills - I know when I was younger, I went through a phase of writing ridiculously (yes, even more ridiculously than now) in an attempt to emulate the Victorian authors I was reading at the time.

Even if we accept the assumption that high idea density equals better language skills (I'm reminded of the much-attributed "Please excuse the long letter, I didn't have time to write a short one."), we have no way of knowing how many of the subjects had higher linguistic ability but chose not to use it to its fullest extent for that particular essay.  What if the true predictor of Alzheimer's is instead whatever process leads the subject to assess that particular essay assignment as more conducive to a simpler writing style?

It would also be interesting to see if the idea density correlation persists over generations.  The various examples of high idea-density sentences I've read seem old-fashioned to me (probably reflecting the fact that the nuns wrote them in the first half of the 20th century), while the low idea-density sentences seem more timeless.

Actually, it might also be a function of the specific education the subject received. Someone being trained in writing today would be guided away from certain stylistic elements in the high idea density example given above (although not necessarily from idea density itself), and these elements seemed fairly common in the various examples of high idea density sentences I've read from this study.

It would also be interesting to see if the idea density pattern holds up in other languages.  In my French writing classes, I was nudged towards a higher idea density than I'd land on naturally, although I never find myself wishing for lower idea density as I translate French to English. Other languages might gravitate to lower or higher density for syntactic or cultural reasons, which might change the correlations with Alzheimer's.

Wednesday, July 22, 2015

Ideas for the Damage Control LW who wants to avoid disclosing her surgery

I didn't notice this Damage Control column when it first came out, but I have an idea for the letter-writer:

I am in my early 30s. As a teenager, I was quite obese (300 lbs), but I am very grateful to say that I have been slim now for several years. But my body still “bears the wounds” of my previous weight: lots of loose skin, a sagging chest, etc. Special garments were needed to hold it in. I recently underwent the first of two surgeries to correct my loose skin, a procedure called a body lift. I took a month off work, and was paid through the company’s short-term disability plan. Though I did say it would be the first of two surgeries, I did not tell people at work the exact nature of my surgery: I think there is a stigma attached to cosmetic procedures. I did get the odd “soft inquiry,” but kept mum. My dilemma is that my second surgery involves a lift and augmentation of both my bum and breast area. How do I handle telling my boss and co-workers without revealing too much or coming off as cold and closed off? Also, how do I respond should I get comments about my new appearance? While I fear negative judgment about being “paid to get a boob job,” this is a private issue that has a long history.
The columnist recommends sending a mass email with an explanation, which I strongly disagree with.  Sending this email might give the impression that your private parts are open for discussion, not to mention that people could feel sexually harassed if they started receiving unsolicited emails about their colleagues' private parts. It might be appropriate in certain cases to tell the whole story to one or two selected colleagues one on one, if the subject should come up in an informal private conversation (depending greatly on context and the nature of the specific interpersonal relationship).  It's even plausible that in some offices, with some specific combinations of interpersonal relationships, it might even be appropriate to tell everyone at happy hour or around the water cooler.  But this isn't a subject for an email. I never thought I'd say this, but email's too formal.  If you disclose, it should be in an informal context that's marked as outside the scope of Business.

But as I read the letter, an idea occurred to me for a two-tiered sneaky approach that could be used if she chooses not to disclose the actual nature of her surgery.

Basic sneaky approach: come up with a cover story about how recovering from an unnamed surgery might cause your body to change shape.  For example, "My doctor recommended Pilates to help me regain strength in various areas after my surgery." True but secondary would be best (e.g. if you don't eat as much when recovering from surgery), but false but plausible would work too. Think about this very carefully, so it's consistent with any observable changes to your body after your first surgery and the changes that will be observed in your body after your second surgery, and script a delivery that will allow you to segue away from the topic of your surgery onto other topics (e.g. "But I've only ever tried Pilates mat work, I've never been to a gym where they use those machines. Does your Pilates class use the machines?  What's it like?")  Once you've worked this out, drop it into conversation with a co-worker or two next time the opportunity arises.  Then, if the rumour mill starts discussing the change in your body shape, an explanation straight from the horse's mouth is readily available.

Advanced sneaky approach: do some research and find another kind of surgery that could plausibly require two procedures and have similar recovery time and other observable effects to what your colleagues can observe about you.  I don't have the medical knowledge to come up with a real example, so I'll use a fake example: boneitis.  Suppose you do some research and discover that boneitis surgery can require two procedures, at similar spacing, with similar recovery times, and can result in a similar change in physical appearance.  Now, if you're ever in a situation where people are inquiring about your medical situation, you can hint in the direction of boneitis.  Don't explicitly say you have boneitis! Instead, present as someone who has boneitis and wants to be discreet about it, so that if people google the hints you're dropping, boneitis will come up near the top of the results.

Obviously this is a bit complex, and whether you actually want to do it will depend on how secret you want to keep your medical situation and your own capacity for subterfuge. But it is an option if you're finding it difficult to say nothing, but don't want to actively disclose.

Saturday, July 11, 2015

Unnecessary TTC announcements

I was riding a subway northbound on the Yonge line. As we travelled from Davisville to Eglinton, the driver made an announcement: "Attention all passengers: we are currently bypassing Spadina station in both directions on both lines due to a police investigation."

Problem: We were heading north from Davisville to Eglinton, which is directly away from Spadina station.  To be affected by delays at Spadina station, a passenger on our train would have to get out, board a train heading in the opposite direction, and travel quite a few stations.  It's highly unlikely that anyone would do this!

Back in my commuting days, I was on trains where this happened quite a few times - the driver announces a delay that's behind us, or heading in the opposite direction, and therefore is not going to affect our train at all and isn't going to affect any of the passengers unless they get out and switch to a train heading in the opposite direction.  These aren't system-wide loudspeaker announcements, like you hear made by a pre-recorded voice when waiting on the platform.  These are announcements made specifically by the driver of our one train.

I don't think they should make these announcements. 

One thing I've noticed since I started following @TTCNotices on Twitter is that the vast majority of delays are cleared very quickly, often within just a couple of minutes. I also learned, back in my commuting days, that even delays for which shuttle buses are called are often cleared so quickly that it's better to wait them out than to get on a shuttle bus.

So I think having drivers make a specific effort to announce delays that don't apply to the train will just make passengers unnecessarily worry and stress and think the system is unreliable.  This is exacerbated by the fact that the audio quality of driver announcements is not as good as that of recorded announcements, so it produces some unnecessary "Wait, what did he say?" moments.

If any passengers are going to be affected by the delay in the opposite direction or behind us, they'll have plenty of time to find out when they're waiting on the platform for their opposite-direction trip, or when they look at one of the video screens on the platforms, or when they check Twitter.

But I think nothing is gained by having drivers make an announcement just within their train when the announcement definitely does not apply to that train.

Sunday, July 05, 2015

Things They Should Invent: emails that self-delete once they've expired

1. Sometimes I get coupons in my email that are only good for a limited time or on a certain day, or if I spend a certain amount of money.  Often when I get them, I don't immediately know whether I might be making any purchases that qualify. But then, a few days later, I realize that I might possibly have a coupon that matches a purchase I'm going to be making.  So I search my email, but the results turn up hundreds of coupons (since I never delete anything) and I don't know which ones are still valid, so I have to open more emails than should be necessary to determine whether there is in fact a relevant coupon.

2. Sometimes I email someone and get an out-of-office message.  If I'm subsequently sending group emails that are only relevant immediately (as opposed to something they'll need when they get back from their absence), I like to leave the people who are absent off the email list.  The easiest way to see who is absent is to visually scan my inbox and see whom I've received Out of Office messages from. (They have a distinctive icon in Outlook, so I can tell at a glance.)  But, again, the problem is that I can't tell at a glance if the out-of-office is still in effect, so I have to open more out-of-office messages than necessary to determine who I should leave off the group emails.

Proposed solution: optional expiry dates in emails.  If the content of the email is expired, the email is deleted from the recipient's inbox.  This would apply automatically to out-of-office messages - once the recipient is back in office, the out-of-office message is deleted from the recipient's inbox.

Variation: recipients can have the option to turn off the self-deleting (either for all emails in their inbox or on an email-by-email basis), but the default setting is self-deletion once the email expires.
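
I'm imagining something like this on the recipient's side - a sketch in Python, assuming a hypothetical "X-Expires" header that senders would set (no such header is standard) and an ordinary IMAP mailbox:

import imaplib
import email
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

def purge_expired(host, user, password):
    # Delete any message whose (hypothetical) X-Expires header has passed.
    with imaplib.IMAP4_SSL(host) as imap:
        imap.login(user, password)
        imap.select("INBOX")
        _, data = imap.search(None, "ALL")
        for num in data[0].split():
            _, msg_data = imap.fetch(num, "(BODY.PEEK[HEADER])")
            headers = email.message_from_bytes(msg_data[0][1])
            expires = headers.get("X-Expires")
            if not expires:
                continue  # no expiry date set: never auto-delete
            try:
                if parsedate_to_datetime(expires) < datetime.now(timezone.utc):
                    imap.store(num, "+FLAGS", "\\Deleted")
            except (TypeError, ValueError):
                pass  # unparseable date: leave the message alone
        imap.expunge()  # actually remove the flagged messages

An out-of-office reply would just set that header to the sender's return date, which would also take care of problem #2.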


Or if that's just too much to handle, a technological solution for the out-of-office problem would be for automatically-generated out-of-office messages to state the return date in the subject line. 

Wednesday, July 01, 2015

Why do young LCBO workers still card me?

One of the things we did in my Sociolinguistics class in university was analyze print advertisements.  While analyzing an ad for some kind of beauty product, the prof asked us who the target audience is.

"Women at the age where they are just starting to see fine lines on their face," said one of my classmates. 

"And what age is that?" asked the prof.

"Late 20s," said one of my older classmates.  The other older classmates and the prof all nodded and murmured assent.

I was rather surprised that the beginnings of wrinkles would turn up while you're still in your 20s, but, being only 19 years old myself at the time, I had no actual frame of reference.

My own fine lines began showing right on schedule, at the age of 27.  And, since I became aware of them, I also began noticing the presence or absence of lines on other people's faces.  I must have seen people with fine lines before, probably including those of my classmates in that Sociolinguistics class who could attest expertly to when fine lines start making their appearance, but it was never a factor that I took specific note of when processing a face as a whole.

It occurs to me that this might be the answer to the mystery of why younger LCBO workers keep carding me when older workers stopped long ago!

If the younger LCBO workers are like my younger self, they might not notice my fine lines as evidence that I'm no teenager.  But they'd be more likely to notice my acne since they've most likely been through acne themselves.

Similarly, people who haven't started greying yet might not notice my few individual grey hairs (I didn't notice random strands of grey on people who were anything less than salt-and-pepper before I started greying myself), but the fact that my hair is long, which is culturally marked as youthful, is readily apparent to anyone of any age.

Tuesday, June 30, 2015

Books read in June 2015

New:

1. The Housekeeper's Tale: The Women Who Really Ran the English Country House by Tessa Boase
2. Naked Came the Phoenix (serial novel) by Barr, Robb, Pickard, Scottoline, O'Shaughnessy, Jance, Kellerman, Clark, Talley, Perry, Gabaldon, McDermid and King
3. An Unnecessary Woman by Rabih Alameddine

Reread:

1. Immortal in Death
2. Rapture in Death

Sunday, June 28, 2015

Epsom salts are the solution to blisters!


Content warning: this post contains graphic, yucky descriptions of blisters and feet. tl;dr: if you have blisters on your feet, soak them in water with epsom salts

Last week, I wore my awesome brown sandals for the first time this season. Even though I've been wearing them comfortably for years, this time around they somehow managed to give me an enormous blister on the bottom of my foot, right where the heel meets the arch of the foot. I was a fair distance from home when I realized I was developing a blister, so I had to walk for another half an hour before I could take the shoes off and treat the blister.  By this time, the blister had grown to about three finger-widths in diameter.

This was, clearly, a problem. I didn't want to burst the blister because then the outer layer of skin would peel off and I'd have an open wound on the bottom of my foot.  (Not the most hygienic place for an open wound!) But if I put a shoe on my foot, the blister would burst by itself from being compressed between my foot and the shoe.  I didn't have a bandage or dressing big enough to cover it and didn't much fancy walking to the drugstore on my blistery foot, so I started googling for home remedies for blisters in the hope of finding something I could do to shrink it with what I had on hand.

The only thing I googled up that I had in the house was epsom salts. I highly doubted that it would work, but soaking my feet sounded nice anyway.  So I soaked my feet in hot water with epsom salts and a drop of iodine, and discovered that the blister was sticking out far less, as though some of the water had drained from it.  However, I didn't feel any stinging when my feet were in the water, so I was pretty sure it hadn't broken open.

Then I went to bed, and slept for 11 hours (I usually sleep 9-10 hours even on non-alarm mornings).  When I woke up, I discovered that the blister was completely empty of water!  However, it hadn't been punctured - the water had either dried up internally or been reabsorbed into my body.  The outer layer of skin was still dead and it seemed like there was still an open wound underneath, but the outer layer was completely stuck to the wound, serving as a very effective moist dressing - which is a bonus since I don't have the materials to make a moist dressing here at home!

My foot stayed like that for a week - the blister didn't fill back up, there was no sign of contamination or infection, it just looked funny - and then one day it became really, really itchy.  I tried to avoid scratching it because I didn't want to damage or contaminate it, but eventually I couldn't resist and scratched it.  The gross dead outer layer of skin came off... revealing fresh, pink new skin underneath, and no hint of an open wound!

I've never before had a large blister heal to completion so quickly, and this was by far the largest blister I've ever had!  Next time I get a blister, I'm going straight to soaking it in epsom salts before I even try anything else.

Saturday, June 27, 2015

Taking for granted achieved!

With yesterday's legalization of same-sex marriage nation-wide in the US (congratulations, by the way!), I was surprised to see a few people on Twitter suggesting that same-sex marriage had been legalized easily and without any fuss in Canada.

At first I was shocked that anyone could forget, but then I realized that same-sex marriage was legalized in Ontario 12 years ago.  There are grown-ass adults who would be legitimately unaware of the struggle to get it legalized for the simple reason that they were children when it happened!

Five years ago, I wrote:
One day, in a couple of decades, we will be celebrating the 20th or 25th anniversary of the legalization of same-sex marriage. I will be in my late 40s, with lines on my face like my father's and salt-and-pepper hair dyed chestnut like my mother's, wearing no-line bifocals as though that little line is the only thing that betrays my age. My co-workers and I (for in my imagined future I'm still in the same workplace with the same co-workers) will sit around the break room reminiscing. Where were you when you first heard? Who was the first same-sex married couple you knew? When was your first big gay wedding? Newspapers will tell the story of how this all came about, track down the court justices and the Michaels and do "Where are they now?" profiles. And in our office will be some new hires, kids in their early 20s just out of university, who will look at all this fuss we're making and feel nothing, because for them it will be something that has always been there.
I'm in my mid-30s, with the lines on my face just beginning to form and enough salt in my pepper that I'm aware of it but not enough that I'm dyeing it. My glasses are still monofocals.  I'm not chitchatting with my co-workers in the break room because I work at home, and I still haven't had the opportunity to attend a big gay wedding.  But already, 10 years earlier than I estimated, there are people who are unaware of the fuss and feel that same-sex marriage has always been there!

Happy Pride, everyone!

Tuesday, June 23, 2015

All change is not created equal.

My various investigations into resilience tend to talk about change an awful lot, often framing people as either embracing change or being change-averse, and talking about how to become more open to change.

And, analyzing my own life, I realized that this is a huge fallacy.  Change is not a monolith.  I (and, I assume, others) embrace change when it's a good change, but want to avoid it when it's a bad change.

For example, I was (and still am) absolutely thrilled about being given the opportunity to work from home rather than going into the office every day.  But that's not because I like change per se, that's because working from home is in all ways superior to working in the office.

And I was stressed like crazy about having to do without my computer when it was being repaired.  But that's not because I dislike change per se, that's because not having a computer is in all ways inferior to having a computer.


I find I am more resistant to change in many areas as life goes on, but that's not because I'm growing to dislike change in my old age. That's because I've been able to figure out how to make more and more areas of my life optimal, so change would make them worse, whereas before I was able to make those areas of life optimal, change would simply make them different.

For example, when I lived in one of the many 1970s highrises in my neighbourhood, with no dishwasher and the laundry in the basement and a small silverfish invasion every spring and fall, I wouldn't have been disappointed if I'd had to change apartments, because there was clear room for improvement and many comparable buildings (with room for better to exist). But then when I moved to my current apartment, which was brand new when I moved in and had all the appliances, much better management and construction, and averaged only one bug a year, I would have been distraught about having to move because there wasn't, to my knowledge or within the reach of my research, anything comparable in existence. (Now there is, but there wasn't for several years after I moved in.)

This has nothing to do with my attitudes about change itself, but rather with the fact that leaving good housing for mediocre housing is different from leaving mediocre housing for other mediocre housing.


My inner conspiracy theorist wonders if this "openness to change" thing is a conspiracy. I'm sure most people welcome change when it's an improvement and dread it when it makes things worse.  But by presenting "openness to change" as a virtue, perhaps the powers that be are trying to shame or embarrass people out of speaking up against changes that will make our lives worse?

Wednesday, June 17, 2015

Things They Should Study: does the success or failure of clothing retailers correlate with specific fashion trends?

A few months ago, they closed the Smart Set in my neighbourhood.  I was disappointed, because some of my very favourite shirts have come from Smart Set.

But, at the same time, I haven't bought anything from them in years.  They discontinued the specific style of shirts that's my very favourite, and, for the past couple of years, haven't had anything in colours that are flattering on me.

This came to mind when I saw that Gap is closing 25% of its North American stores.  Again, some of my favourite pieces are from Gap, but at the same time I haven't bought anything from them in years because they haven't had styles and colours that are flattering on me.

In general, the trends of the past few years have been unflattering on me, so I haven't bought nearly as many clothes as I did in previous years.  I don't feel enthusiastic about anything I see in stores, I don't feel moved to stock up on anything, and I keep reading about how clothing retail is dying.

It would be interesting to study this on a broader level and see if there is a correlation between specific fashion trends and the success or failure of clothing retail businesses.  You'd have to control for overall economic conditions, which should be fairly straightforward (is clothing retail growing/shrinking faster than the overall economy?).  You might also be able to control for other factors (such as the growth of online shopping) by comparing men's and women's clothing retail. Trends aren't the same for both genders, so if, for example, women's retail slows down significantly compared to men's when baggy white shirts are in style for women, then we'd have evidence suggesting that baggy white shirts are bad for women's clothing retail.
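I'm no statistician, but just to make the comparison concrete, here's a rough sketch of the core calculation in Python.  Every number and column name is made up for illustration - a real study would need actual sales data and proper statistical tests.

```python
import pandas as pd

# Entirely hypothetical yearly figures: growth rates in %, plus a flag
# for whether a given trend (say, baggy white shirts) was in style.
data = pd.DataFrame({
    "year":                 [2010, 2011, 2012, 2013, 2014],
    "gdp_growth":           [3.0, 2.5, 1.8, 2.2, 2.4],
    "womens_retail_growth": [4.0, 1.0, -0.5, 0.2, 3.1],
    "mens_retail_growth":   [3.5, 2.8, 2.0, 2.5, 2.6],
    "trend_in_style":       [False, True, True, True, False],
})

# Control for overall economic conditions: how much faster (or slower)
# is each retail sector growing than the economy as a whole?
data["womens_relative"] = data["womens_retail_growth"] - data["gdp_growth"]
data["mens_relative"] = data["mens_retail_growth"] - data["gdp_growth"]

# Control for factors that hit both genders (like online shopping)
# by comparing women's relative growth against men's.
data["gender_gap"] = data["womens_relative"] - data["mens_relative"]

# If the gap is consistently more negative in trend years, that's
# evidence the trend is bad for women's clothing retail.
print(data.groupby("trend_in_style")["gender_gap"].mean())
```

Obviously the real version would need significance tests and far more data, but that's the shape of the comparison.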

Friday, June 12, 2015

Things They Should Invent: train PSWs in feminine facial hair removal techniques

A while back, I came up with the idea that nursing homes should provide free esthetics services so female patients don't have to deal with the indignity of facial hair.

Today, my shower gave me a far simpler idea: PSWs (personal support workers) should be trained in hair removal methods that are appropriate for women's facial hair.


By general societal standards, removing facial hair is seen as more optional for men than for women. PSWs are trained in the more-optional removal of men's facial hair, so they should also be trained in the more-mandatory removal of women's facial hair.

As we know from our own firsthand experiences, tweezing out your yucky chin hairs is more of an everyday personal grooming thing that you do in your own bathroom than a specialized beauty treatment for which you go to a beautician.  Therefore, it should be treated as such and be part of the patient's everyday personal care done by their PSWs.  (Yes, beauticians do provide more hardcore facial hair removal services.  Barbers will also shave clients if asked, but male patients get shaved by PSWs rather than having to pay to go down to the hairdresser.)

Some will argue that PSWs are already trained in shaving and that's a hair removal method.  But it's not a correct, appropriate, suitable method for women's facial hair. Shaving results in same-day regrowth and stubble (especially on hairier-than-average people - and any woman with facial hair is hairier than average), which means that the socially-inappropriate facial hair problem will return before the end of the day.  Removing the hair at the root means the removal will last several days and the hair will grow back more gently and less visibly, allowing the patient to retain her dignity for longer.

And that's what this really is - a question of dignity.  Tweezing or threading or otherwise removing the hair at the root spares female patients the indignity of facial hair and the indignity of suffering through the masculine-marked process of having their face shaved. PSWs are trained to retain as much of patients' dignity as possible when bathing them, dressing them, toileting them, feeding them, moving them - every single area of daily life.  This should include the removal of unsightly facial hair.

Thursday, June 11, 2015

Things They Should Invent: cellular network detection device

My cellphone uses both the Rogers and Fido networks. The other day I was involved in a long texting conversation while walking around the neighbourhood doing my errands, and I noticed that in certain places I got the Rogers network but not the Fido network, and in other places I got the Fido network but not the Rogers network.

This makes me think that it might be possible for there to be certain dead zones for a particular cell phone provider even within an area that's supposed to get service from them.  Which could be an annoyance if you switch providers only to find that you can't get service in your apartment or in your office.

Proposed solution: some kind of a device that can tell you which cellular networks can be picked up in a particular place.  You carry it around, it detects networks, and it tells you which networks it detects.

These devices could be rented out by cellphone retailers for a reasonable price per day. I'm sure potential customers would be quite happy to pay a reasonable amount to confirm that a signal is available right where they need it, and I'm sure cellphone providers who try to compete on signal quality would be happy to empower potential customers to confirm the quality of their signal.

Currently, if you look on cellphone providers' websites to see where their signal is available, they give a rough geographical map. Since I live in the geographical centre of Toronto, all providers claim to provide service in my neighbourhood.  Nevertheless, there are pockets where the Rogers signal can't reach, and pockets where the Fido signal can't reach, which suggests that there may well be pockets where other signals can't reach.  Cell providers can't reasonably be expected to provide a map of all these pockets, but surely they could provide us with a device that would let us detect them ourselves.

Maybe someone could even make an app that would do this?
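I have no idea what the radio-detection side of such an app would involve, but the reporting side seems simple enough.  Purely as an illustration - the locations and signal readings below are invented - here's a sketch in Python of how it might summarize a walk-around survey:

```python
import pandas as pd

# Hypothetical walk-around survey log: at each spot, the app records
# the signal strength it detects for each network (0 bars = no signal).
log = pd.DataFrame({
    "spot":   ["home", "lobby", "grocery store", "office", "park"],
    "rogers": [3, 2, 0, 4, 1],
    "fido":   [0, 2, 3, 4, 0],
})

# Flag the pockets where each network can't be picked up.
for network in ("rogers", "fido"):
    dead = log.loc[log[network] == 0, "spot"].tolist()
    print(network, "dead zones:", ", ".join(dead) if dead else "none")
```

Pair each reading with a location and you've got your map of the pockets.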

Tuesday, June 02, 2015

Could working-class women dress themselves when upper-class women couldn't?

At certain points in Western history, aristocratic women didn't dress themselves.  They had their maids help them.  Based on what I've absorbed from the ether, they weren't necessarily able to dress themselves either, because of the design and complexity of the clothes.

For example, there's a scene in Downton where Lady Mary is going away for a weekend tryst, and she and Anna are looking through her wardrobe making sure that everything she packs is something she can put on all by herself (implying that she can't dress herself in all her clothes independently).  And this is in the 1920s when clothes were easier - in the Edwardian and Victorian eras, with corsets and crinolines and everything, it would be even more difficult to dress oneself.

I also recently read a book that mentioned that Edwardian upper-class ladies would wear tea gowns in the afternoons because that's when they met with their lovers, and tea gowns were something that a lady could put back on herself (implying that she's not able to put on her other styles of dresses herself).

This makes me wonder about the situation for working-class women.  Even if their dresses are more practical, the maids on Downton still have corsets and petticoats before the 1920s.  (In fact, there was a brief period where the aristocrats were wearing the newer, more comfortable uncorseted dresses, but the maids - who had to do actual physical labour - were still in the old corseted dresses!)  Could they dress themselves, or did they have to help each other dress?  What about Daisy, who woke up before anyone else in the house?  What about Mrs. Hughes and Mrs. Patmore?  Did one of their subordinates see them in their underthings every morning?  What if a working-class woman lived alone?  If a household consisted of just husband and wife, did he have to learn how to do up a corset?

Sunday, May 31, 2015

Books read in May 2015

New:

1. What If?: Serious Scientific Answers to Absurd Hypothetical Questions by Randall Munroe
2. The Rosie Project by Graeme Simsion 
3. This is Improbable Too by Marc Abrahams
4. The Brain's Way of Healing by Norman Doidge
5. The Myth of Alzheimer's by Peter J. Whitehouse with Daniel George 
6. The Spirit Catches You and You Fall Down: A Hmong Child, Her American Doctors, and the Collision of Two Cultures by Anne Fadiman 
7. Gilgamesh (Stephen Mitchell version)

Reread:

1. Naked in Death
2. Glory in Death 

Teach me about the connotations of Orange County, California in the 1980s

When I was in elementary school (between 1985 and 1991), this story-teller sort of guy came to our school and told us some stories.  When it came time to tell us the last story, he said we could choose between two: one was about a boy and his pond, and the other was about a big-city thief.  His tone and delivery suggested that the boy and his pond story was idyllic (and, by extension, boring) and the big-city thief story was exciting. My schoolmates overwhelmingly voted for the story about the thief, so he told us that story.

Afterwards, there was a Q&A session, and someone asked him if anyone ever actually asked to hear the story of the boy and his pond, and he replied that it had happened once, in Orange County, California. His tone and delivery suggested that if you knew anything about Orange County, California, you'd understand why this was and perhaps find it humorous.

Of course, as an elementary school student in southern Ontario, I didn't know anything about Orange County, California.  In fact, I still don't.  This memory came back to me in the shower this morning so I've been doing some googling, and I still can't figure out any characteristics of Orange County that would make it clear why students there in the 1980s would prefer to hear a story about a boy and his pond. 

Anyone have any insight?

Saturday, May 30, 2015

The international exit sign

[image: the international exit sign - a green running figure heading for a door]
I first saw this kind of sign in Frankfurt Airport in 1998.  I had just gotten off a long flight and badly had to pee.  In my condition, I thought the sign pointed to the washrooms.  After all, where else would a person possibly want to run to?  I followed the arrows, running nearly as fast as the figure in the sign, and eventually found some washrooms, to my great relief! It wasn't until several days later, when I saw the sign in a context where it was clearly pointing to the exit and not the washrooms, that I realized what it meant.

They recently installed this sign in my apartment building, replacing the red EXIT signs that are more commonly used in North America.

And every time I see it, I feel like I have to pee.

Friday, May 29, 2015

Whistling

Whistling is hard, at least compared with other ways of producing potentially-musical noise such as humming or singing or just opening your mouth and vocalizing. It takes more skill and experience and precision to produce the intended note than it does with humming or singing.

Whistling is also non-intuitive compared with other ways of producing potentially-musical noise.  When you see a pre-verbal baby vocalizing, you can see how a person might stumble upon humming or singing, but you can't see whistling just happening by accident.

And yet somehow, someone in human history figured out how to whistle.  And thought it was worth the trouble as opposed to humming or singing.  And, somehow, the idea caught on and now it's something that everyone is at least aware of if not capable of doing. (Unless it's cultural?  A quick google for whether there are any cultures that don't have whistling only turns up cultures where whistling plays a key role.)

And not only does whistling persist on a macro level, it also persists on an individual level.  There are people who, when they have a tune in their head that they want to express, opt to whistle it out instead of humming or singing or going "dodo dodo dooo".

I can't fathom why whistling is so normalized or why a person would opt to whistle rather than hum their current earworm, but it is an interesting cultural phenomenon.

Thursday, May 28, 2015

The mystery of the Yonge Eglinton haters

The "density creeps" who have been in the news lately remind me of one of the mysteries of Yonge & Eglinton: people who deliberately move here and then complain that the neighbourhood has characteristics that it has had since long before they moved here.

In the density creeps story, that characteristic is density.  Near the proposed development site are highrise buildings, which are part of the highrise cluster that was built in the 1970s, 20 years before the density creeps moved here.  There are also four 4-storey apartment buildings that appear architecturally to date back to the 1950s on that one block alone.

In short, the kind of density they decry, along with the attendant impact on property values and population demographics, was well-established in the neighbourhood long before they even arrived.

(Which makes me want to flag a lot of the commentary on this story with #JournalismWanted - many commentators seem to be taking the density creeps at their word that this new development is somehow significantly denser or significantly cheaper than the established neighbourhood, when this allegation could be disproven with a simple google, or by going to the site (conveniently located just 4 blocks north of Eglinton subway station!) and taking a quick look around.)

But the density creeps aren't the only ones I've seen doing this.  Far more frequently than you'd expect - mostly on the internet, but sometimes just walking down the street - I hear complaints about things like density or highrises or chain stores or yuppies from people who live here and who, based on demographics, appear to have moved here recently and by choice (i.e. they're old enough and employed enough to live independently of their parents, but young enough that they definitely didn't move here before the 21st century).  These are all things that have been here since before the 21st century, and things whose presence you can easily detect by walking down the street.  If you don't like them, you can see that the neighbourhood isn't for you the moment you emerge from the subway.

The other thing is, this isn't the cheapest neighbourhood.  If you want lower density or lowrises or fewer chain stores or fewer yuppies, there are other neighbourhoods that meet those characteristics and are cheaper to live in. So what are they doing here?


Despite the criticism from some quarters, this isn't the worst neighbourhood in Toronto.  We're generally closer to the top than to the bottom for indicators such as amenities, services, accessibility, quality of schools, quality of housing stock, infrastructure, low crime rates, etc.

I wonder if people in neighbourhoods that are worse in all these areas complain as much as the residents of Yonge & Eg, who, by all appearances, could totally choose to live elsewhere?

Sunday, May 24, 2015

Various thoughts on various kinds of prejudice depicted in Call the Midwife (full spoilers)

1. In one episode, the expectant parents with the Medical Drama of the Week happen to be a black couple.  It's mentioned in passing that they're from another country, and their accent suggests somewhere in the Caribbean (I'm not familiar enough with Caribbean accents or history to narrow it to a specific country, and further details were not given on-screen.)  The husband is a bus driver, and they live in one of the nicer flats portrayed in the series (clean, well-lit, decorated, not overly cramped).  As I watched this, I appreciated that they managed to portray the real-life diversity of London in a matter-of-fact sort of way that wasn't limited to discrimination plotlines.

In the next episode, there was an Irish family that was living in squalor and destitution because people wouldn't hire them or rent housing to them on the grounds that they were Irish. My first thought was surprise that people would even consider holding such petty prejudices so soon after WWII.  But then I was even more surprised that in a time and place where English people would discriminate against Irish people for employment and housing, black people could successfully get employment and housing!  It seems like black people would have seemed more Other to the white English majority.

They did show a black patient facing prejudice in a previous episode (I can't remember if they've shown Irish people not facing prejudice).  Before the Irish episode, I was able to handwave the fact that this more recent black couple wasn't facing prejudice with the intellectual understanding that showing diversity outside of discrimination plotlines is a good thing.  But after the Irish episode, I had more trouble getting past it, feeling like we needed an explanation of why they didn't face discrimination.

2. In one episode, a young man was discovered to be gay when he fell into a police sting operation, where the police had an undercover officer hanging out in a public washroom trying to instigate a tryst. I'm well aware that homophobia was far more rampant in that era, but I'm surprised they'd consider that a good use of police resources!

3. In the same episode, the neighbourhood had their Rose Queen festival, where tradition dictates that the new Rose Queen is crowned by last year's Rose Queen.  As it happens, last year's Rose Queen is the wife of the young man who was discovered to be gay.  As a result, there was vocal outcry about her participating in the Rose Queen ceremony.

I was kind of surprised that the woman who unwittingly married a gay man wasn't seen as a victim.  I was kind of surprised that the fact that she was pregnant didn't count in her/their favour.  But more than anything, even given the ignorance and homophobia of the era, I was surprised that someone would get from "Her husband is gay" to "So, naturally, we can't possibly have her fulfill the duties of the outgoing Rose Queen!"  It's so inconsequential, and so irrelevant to her husband, and so ephemeral, I was amazed that the people of Poplar had time in their busy, hardship-filled lives to think about it.

4. After Patsy attends a particularly emotionally devastating birth, she goes to visit Delia for comfort. She lets herself into the nurses' home where Delia lives, goes to Delia's room, and sits on her bed crying while Delia consoles her. After the first wave of sobbing is over, Patsy reassures Delia that she'll be out of there very early in the morning, so "no one will ever know I was here".

It surprises me that anyone in that era and setting would even conclude "Patsy is in Delia's room crying" = "Clearly, they're lesbians!"  Patsy used to work in that hospital (and, presumably, used to live in that nurses' home) and, since Delia is her best friend, they've probably spent a lot of time hanging out in each other's rooms, much like the secular midwives at Nonnatus.  And, since they're both young nurses, this probably isn't the first time one of them has had an emotionally devastating nursing experience.  If anyone wondered what was going on, they'd simply have to tell them the truth: Patsy just came from a delivery of undiagnosed twins - the first one stillborn, the second still alive - and she struggled to keep a brave face throughout the ordeal for the sake of the patient.  So now she's talking through it with her best friend and fellow nurse, just as they always did with emotionally-difficult cases when they worked together, in a place where they would have frequently hung out back then.  Given that same-sex relationships weren't seen as "normal" or common in those days, I'm surprised that they think people would arrive at "They must be lesbians!" rather than "Poor Patsy, she had a rough day!"

5. But just a few episodes later, Patsy and Delia decide to get a flat together.  And they don't seem too worried about people finding out about their relationship.  "Lot of girls share flats," they say, "Not even a nun would bat an eyelid."  Again, I found this hard to reconcile with their previous fear of being caught talking in Delia's room together.  If you can't even be seen hanging out in your best friend's room in a way that's been established as perfectly normal among nurses who work together, aren't people going to raise an eyebrow when you start living together in your own flat?

Saturday, May 16, 2015

The folly of condemning a boycott

There was recently a story tweeted into my feed about proposed "zero tolerance" for boycotting Israel.

This reminded me of something I've seen in US contexts: when there is a boycott of a business because of its business or labour practices, there are some commentators who say it's unethical to boycott the business in question.

This is ridiculous and unworkable.


I want to make it clear, I don't have a horse in this race.  To the best of my knowledge, none of the products I regularly buy or consider buying are from Israel.  All the cases I've heard of where people are talking about boycotts as though they're unethical have to do with US retailers that aren't available to my Canadian self.  I don't even have an opportunity to make these decisions, so I'm writing here solely as an external observer.  And as an external observer, I just don't see how boycotting could be unethical or something that you could have "zero tolerance" for, because of the very nature of a boycott.


What is a boycott?   It's choosing not to deal with a person or organization because you oppose some action or policy of theirs. (For syntactic simplicity, in this post I'm going to talk about boycott in terms of choosing not to buy from somewhere, but this can extend to all types of boycott.)


 So if boycotting is unethical or punishable, that would mean that, in order to behave ethically or to not be punished, you are required to buy from them.

And that's clearly unworkable.  The vast majority of people don't buy from the vast majority of sources the vast majority of the time.  Sometimes there's a better source, sometimes there's a more affordable source, sometimes there's a more readily available source, sometimes we simply don't need or want or can't afford the product in question.  If you're going to condemn people for not buying from somewhere, you'd have to condemn nearly everyone in the world.  (And on top of that there's the question of people who have bought from there but not recently. How do you tell if they've moved from buying to boycotting or if they just haven't needed to buy anything lately?)


At this point, some of you are thinking I'm oversimplifying things. After all, a boycott isn't simply not buying from somewhere, it's making a concerted choice not to buy because you oppose the source's policies and/or actions.

So let's follow this to its natural conclusion. If the anti-boycott people are okay with consumers simply happening to not buy certain products or services as a result of the natural course of their lives, but are opposed to us making the deliberate, mindful decision not to buy from certain sources to disincentivize them from behaviour we believe to be harmful, that would mean that the moral/legal imperative to buy from the source is triggered by the source's harmful behaviour.  If the source behaved in a way we considered appropriate, we wouldn't want to boycott them and therefore wouldn't be obligated to buy from them.  But as soon as they engage in behaviour we find unacceptable, we're obligated to buy from them in order to avoid engaging in the allegedly immoral/punishable act of boycotting.

Which is, like, the exact opposite of how market forces are supposed to work.  (Noteworthy because, I've noticed, many of the people saying boycotts are unethical seem to value market forces otherwise.)

Tuesday, May 12, 2015

Why are manufacturers pushing detergent pods?

I'm signed up for various free sample and coupon sites, and I've noticed recently that they are really pushing detergent pods, for both laundry and dish detergent.  Samples are only ever of detergent pods, never regular liquid or powder detergent, and now I'm finding sometimes you can only get coupons for the pods, not for the regular detergent.

I wonder why they're pushing them so hard?

I have found that, without exception, the detergent pods are far inferior to regular liquid detergent (and to old-fashioned powder detergent).  They simply don't break up in the machine when used as directed, so you end up with half a pod, a few clumps of detergent powder, and a not-fully-clean load of laundry or dishes.

On top of that, detergent pods seem like they'd be more expensive to manufacture than regular detergent, because you'd have to make the different components and then combine them all into a pod and count out a specific number of pods into each container, whereas with liquid or powder detergent you can just manufacture it in bulk in a giant vat and dispense it into containers.

Even if there is some reason I can't see why some customers might prefer pods, why are manufacturers pushing pods to the exclusion of regular detergents?  What is gained by trying to urge us away from the more effective product that's easier to manufacture?

Monday, May 11, 2015

Stressing about stress

As you've noticed if you've been reading me these past few months, I've been getting stressed about various things that I think are too petty to be getting stressed about.

And, I realized, the very fact that I was getting stressed about these things was stressing me out.  In addition to dealing with or coping with the stressors, I was stressing about the fact that I was dealing with or coping with the stressors less perfectly than I thought I should be.

Because of that, this blog post was originally going to be about the balance of self-care vs. self-improvement. On one hand, maybe I should just take an "it is what it is" approach during high-stress times - deal with what's actionable, care for myself the best I'm able to, get through it, and regroup when life stabilizes.  On the other hand, I'm not going to become a competent and adequate human being if I baby myself instead of treating the areas where I'm not a competent and adequate human being like problems!


Then two things happened:


First, one day, about six weeks after I got my computer back from the depot, I got out of the shower to find my apartment flooded with golden morning sunlight.  I put on my bathrobe, made a cup of coffee, and sat in the sunshine with my hot coffee and my wet hair, being warmed up inside and out.  It was peaceful and delightful in a way I hadn't experienced in quite a while.

Despite the fact that I have my morning coffee in the sunshine every sunny morning.


During one of my computer-less days during the depot drama several weeks previous, I'd been sprawled on the living-room floor in the sunshine reading the newspaper, and yearning for idle aimless internetting.  I thought back to when I was a teen, when sprawling on the floor in the sunshine reading the newspaper was one of my favourite ways to spend a weekend afternoon.  So I started worrying: what happened?  Why wasn't this good enough for me anymore?

But in that contented morning sunshine several weeks later, I realized that the stress of the computer drama (and the stress over the fact that I was stressed by the computer drama) was actually making it impossible for me to enjoy the simple things in life like my morning coffee.  It's like when your Sim's "Tense" moodlet is too strong - you could be drinking coffee and sitting in a beautiful room and listening to music, and none of those things are going to outweigh the tense.  So I hadn't lost my ability to enjoy simple pleasures, I was just at a stress level that was beyond what simple pleasures could achieve.


The second thing that happened was my little breast lump adventure. Even in the shock of getting a telephone call telling me I needed a mammogram (when I didn't know that was a thing that could happen at that point in the diagnostic protocol), I wasn't nearly as stressed as I was with my computer out for repair and no fanfiction to tide me over.  Why on earth was this??  WTF is wrong with my priorities???

After some thought, I came to the realization that I wasn't as stressed during the breast lump incident because I felt like I was allowed to be stressed about it, so I wasn't stressing about being stressed.  I'm allowed to be stressed!  I have to get a mammogram at the age of 34 FFS!  So I just flipped the world the metaphorical bird, had comfort food and wine (for which I got carded - if there hadn't been a dudebro behind me in line, I would have actually called the cashier out on that), and got myself through that night and off to the clinic the next day. I'm not sure if anything else got done that day, but it didn't matter.  I went from thinking my first mammogram would be in 15 years to learning my first mammogram would in fact be in 15 hours, and I had to assimilate that information and deal with the mammogram process and all the attendant what-ifs.  I just got through it, regrouped on the other side, and life proceeded with as little stress as humanly possible under the circumstances.

Reflecting upon this, I realized a similar thing happened after my grandmother passed away.  My employer gave me a certain amount of bereavement leave, so I made the decision to use this time to process the experience however I needed to.  Apart from any duty to my family, I decreed to myself that I wasn't required to do anything specific during those days.  A day spent doing nothing but gaming, drinking, and eating cheese was totally allowed. A day spent in bed watching Eddie Izzard videos was totally allowed.  If I felt the need to do something completely uncharacteristic like take a long walk in the woods, that was totally allowed.  There was no wrong way to use my time.  And because I wasn't worrying about my day-to-day (I was allowed to do whatever I wanted, and if I found myself at a loss the system was still there), I didn't stress, just processed my bereavement as much as one can in six days and then returned to work on Monday.


So from all this, perhaps I can conclude that if I give myself permission to be stressed by the things that are stressing me, they won't stress me as much.

But, on the other hand, I'm very good at justifying self-indulgence. And I don't think you get to be good enough by telling yourself it's okay to not be good enough.

Sunday, May 10, 2015

Another reason why early sex ed will lead to less early sex

This post was inspired by, but is not directly related to, this quiz testing how much you know about the new Ontario sex ed curriculum. (I got 9/10.)

Some critics of sex ed object to teaching students about various sex acts at an age that is generally perceived to be too young to be engaging in those sex acts.

But it occurs to me that if your goal is to prevent young people from having sex, introducing the concepts early would probably help achieve that goal.

I was informed, via age-appropriate educational books, about the existence of various sex acts years before I was ready for them (which was a good thing, since I reached menarche years before I had the slightest even theoretical interest in sex), and every single time my visceral reaction was "Ewww, gross!!!!"  As I evolved in the direction of developing interest in sex, I had to overcome the "Ewww, gross!!!!" before I could develop positive interest.

I also learned of various other sex acts, via the internet, when I was older and ready to have sex.  In these situations, my reaction was either "Hmm, interesting..." or "Meh, not for me."  Even for the sex acts I find more distasteful (which are objectively more distasteful than any of the sex acts I learned about before I was ready for sex) I never reached the same level of visceral revulsion as I did before I was ready to have sex.

So if you want young people to not have sex, telling them about sex when they're young enough to think that it's gross will introduce an additional emotional barrier that will stand between them and their desire to have sex for a certain period of time.

Things They Should Study: do more apartments get too hot or too cold in shoulder seasons?

I was very happy to hear that the City of Toronto is consulting the public about indoor temperature bylaws for rental housing.  I'm miserable for a week or two every May and September because the weather is hot but my landlord is legally required to provide heat (and, therefore, can't have the building's air conditioning turned on.)  So I was all set to write a submission advocating for air conditioning to have precedence over heating during shoulder seasons with warm daytime highs and cool overnight lows.

Whenever air conditioning is available, I set my thermostat to 25 degrees, which is the highest it will go. And the air conditioning switches on nearly every single day.

In cool weather,  I set my thermostat to 20 degrees, which is the lowest it will go. And the heating switches on an average of once per year.  Some years it's one time, some years it's two times, some years it's zero times.  Last winter, it was zero times.

Therefore, I strongly advocate for air conditioning taking precedence over heating in the shoulder seasons.  Even if it gets cold in your apartment overnight, you can just snuggle up under an extra blanket.  Certainly a fair price to pay for being comfortable during the day!


But as I was writing this, it occurred to me that this could be studied comprehensively for a wide variety of housing types.  Get residents of buildings of a wide variety of sizes, ages and constructions, with the sample including apartments with exposure in each direction (and corner units).  Have the study participants agree not to use heating or air conditioning during the study period, and to use optimal temperature management practices otherwise (e.g. blinds open to let the sun in if it's cold out, blinds closed to keep the sun out if it's hot out, windows open if you want the indoor temperature to move in the direction of the outdoor temperature, windows closed if you don't, minimize use of electronics and appliances if it's hot, etc.).  Then track the temperature inside the apartments, and have residents record their comfort level.

Perhaps they could come to a definitive, evidence-based conclusion about whether heating or air conditioning should be prioritized.  Perhaps they could come to a definitive, evidence-based conclusion about whether more people and homes get too hot or too cold in the shoulder seasons in the absence of appropriate indoor climate control.  Maybe there are patterns based on type or age of building, and bylaws that take that into account would be more appropriate. 
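The analysis on top of all that data wouldn't even need to be fancy.  Here's a rough sketch in Python - all the readings, the comfort scale, and the building ages are made up for illustration - of how you might tally whether apartments skew too hot or too cold in the shoulder seasons:

```python
import pandas as pd

# Hypothetical study data: one row per hourly reading per apartment.
# comfort is the resident's rating: -1 = too cold, 0 = fine, +1 = too hot.
readings = pd.DataFrame({
    "apartment":    ["A", "A", "A", "B", "B", "B", "C", "C", "C"],
    "building_age": [1970, 1970, 1970, 2005, 2005, 2005, 1955, 1955, 1955],
    "temp_c":       [27.5, 26.0, 19.5, 25.0, 24.0, 21.0, 18.5, 19.0, 26.5],
    "comfort":      [1, 1, 0, 1, 0, 0, -1, -1, 1],
})

# Overall: are "too hot" hours more common than "too cold" hours?
share_too_hot = (readings["comfort"] == 1).mean()
share_too_cold = (readings["comfort"] == -1).mean()
print(f"too hot: {share_too_hot:.0%}, too cold: {share_too_cold:.0%}")

# By building age: maybe 1970s towers overheat while older walk-ups run
# cold, in which case a one-size-fits-all bylaw is the wrong tool.
# (Positive mean = skews too hot; negative mean = skews too cold.)
print(readings.groupby("building_age")["comfort"].mean())

# Average indoor temperature by building age, for checking against
# whatever thresholds a bylaw might set.
print(readings.groupby("building_age")["temp_c"].mean())
```

If the "too hot" share comes out consistently higher, that's the evidence-based case for air conditioning taking precedence; if it varies by building type, that's the case for a bylaw that distinguishes between building types.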

We already know the current bylaw does not reflect the needs of our current climate and housing stock.  We should take this opportunity to do research and identify what exactly our needs are, and write a bylaw that reflects that.