1. The Nova Scotia mass murderer may have done a dry run
In partially redacted court documents, GW’s common-law spouse says the pair “drove the back roads” together on April 18. She made the statement in an interview with RCMP Staff Sergeant Greg Vardy.
“They … ended up near a penitentiary that GW said his uncle Glynn had been a prisoner there,” said Vardy, as recounted in a court document signed by RCMP Sgt. Angela Hawryluk.
CLS [common-law spouse] did not say which penitentiary. But the only federal prison in Nova Scotia for men is the Springhill Institution, about a 50-minute drive from the couple’s house in Portapique. The next closest federal prison for men is in Dorchester, New Brunswick, about an hour-and-20-minute drive from Portapique, but one would have to drive at least partly on Highway 104, which is a far cry from a “back road.”
If the pair did visit Springhill Institution, the trip has ominous implications: Sean McLeod, who was a correctional manager at Springhill Institution, and his partner, Alanna Jenkins, a correctional manager at the Nova Institution for Women in Truro, were murdered by GW just hours later, at their Hunter Road house.
CLS “said that the McLeods would come for drinks and knew that they had been married and divorced,” reads the court document. “She thought that GW got the handcuffs and corrections uniform from him. She said that GW seemed to like them both and there were never any arguments.”
2. Where do the candidates stand? (And more importantly, what do they think?)
Three men are running for mayor in the upcoming Halifax municipal elections, and there is a slew of candidates vying for council seats. For instance, in District 13, where I live, we have nine candidates; nearby District 11 has 12.
It’s impossible to do one-on-one interviews with this many people, so Zane Woodford has sent the same set of questions to council and mayoralty candidates, and the Examiner will be running their responses in the coming days.
To start off, we have the mayoralty candidates and those running for office in District 1 – Waverley-Fall River-Musquodoboit Valley.
Let’s take the mayoralty candidates first. And before we look at what they have to say, please remember that Woodford ran their answers as he received them, unedited.
The five questions the Examiner posed to the candidates are as follows:
What should Halifax be doing to create more affordable and accessible housing?
Would you support a reduction of the Halifax Regional Police budget for fiscal 2021-2022? Why or why not?
Should Halifax require contractors to pay workers a living wage? Why or why not?
In response to the climate crisis, Halifax regional council passed an action plan, HalifACT 2050, in June. How will you support accomplishing the plan’s goals?
How often do you use Halifax Transit?
You can read the mayoral candidates’ answers in full here, but I’ll highlight a few aspects of their responses.
First, each of these guys replies in much the way you would expect them to, if you have followed the campaign at all. Savage offers several paragraphs for most answers, and chooses his words carefully. For instance, his response to the question of whether or not he will support municipal contractors paying a living wage doesn’t reference a living wage at all, but “a fair wage.”
Taylor is blunt and to the point, with answers like this one:
When I lived in Bedford I took the bus a bit. But I found the schedules and routes didn’t work for me, I lived in Ingramport for a while, and I would have killed to have a public transit out that way. Transit is definitely a priority.
And Whitman, who believes communications staff are a waste of money, offered this reply to the question on defunding the police:
I’m opposed to defunding Pilice. I believe in DEFENDING police. I believe there are ways to benefit from reviewing current policing, including the RCMP’s role in HRM. Increased enforcement is key as well as police body cameras are red light cameras.
Now, on to District 1. Geographically, this is one of the largest districts in HRM, encompassing rural and suburban areas. Incumbent Steve Streatch is facing three challengers: Cathy Deagle-Gammon, Stephen Kamperman, and Arthur Wamback.
Deagle-Gammon is executive director of the Dartmouth Adult Services Centre. I can’t tell you much about the other candidates’ backgrounds: I get a bandwidth-exceeded error when I try to access Streatch’s site; Kamperman’s leads to a Facebook page with no “About” information; and Wamback’s includes a lot on his views, but not much on his personal background.
I have to say I was impressed with their replies, though. For the most part, the candidates offer well-thought-out responses, in many cases with specific policies they would like to see adopted. They also struggle with a paradox of the policing question: many in the district see it as under-policed, especially in the rural areas, while at the same time recognizing that re-allocating parts of the police budget may make sense.
Look for more responses from council candidates in the days ahead.
3. Ride-hailing rules adopted
As Halifax considers adopting an ordinance requiring contractors to pay a living wage to employees, yesterday the municipality adopted rules allowing ride-hailing companies to operate — companies like Lyft, which has said it might go bankrupt if it had to treat drivers as employees. In other words, its business model is based on generally not paying a living wage.
Zane Woodford reports on the council decision, which was opposed by councillors Lindell Smith, Shawn Cleary, Stephen Adams and Richard Zurawski:
The regulations require ride-hailing companies to buy an annual licence for between $2,000 and $25,000, depending on the number of vehicles in their fleet…
The drivers will also have to pass criminal record checks, vulnerable sector checks and child abuse registry checks, which as the Halifax Examiner reported following first reading of the bylaw amendments last month, Uber took issue with, arguing in a memo to councillors that the checks were redundant.
Also at council yesterday, the Open Mic House on Agricola, near Willow, was designated a heritage building. A lot of great music has come out of that place, and it was also the site of a spelling bee several years ago that is noteworthy because the judges were dressed like pirates and I came second (losing to a King’s student).
One of my kids used to live at the Open Mic House, and as he watched the livestream of the council meeting yesterday, he noticed that a drawing he did of the building when he lived there in 2013 was included as part of the heritage application.
4. Dal prof says Sipekne’katik fishery not a threat to stocks
At CBC, Emma Smith talks to Megan Bailey, Canada Research Chair in integrated ocean and coastal governance at Dalhousie, on whether the Indigenous moderate livelihood fishery poses a threat to lobster stocks.
“If we look at kind of what the commercial effort is normally in that area and it’s hundreds of thousands of traps, the 250 traps going in right now, it’s a negligible impact on the stock and I don’t think it’s a conservation concern at this scale,” [Bailey] told CBC’s Information Morning on Tuesday…
“I recognize and I empathize with the commercial fishing sector that this seems like a conservation risk. I don’t think it is. I don’t think the science would support that,” she said.
“We really need to work towards de-escalation and for me, I think that’s the commercial fishing sector backing off and letting DFO do its job and recognizing that this is a treaty fishery, that there is a right,” Bailey said. “Whether commercial fishermen are OK with it or not, it doesn’t matter. It’s not up to them.”
One thing that jumped out at me from this story was a stat attributed to Colin Sproul, who heads the Bay of Fundy Inshore Fishermen’s Association. Smith writes that Sproul told CBC’s Mainstreet that his group says “lobster landings in the area have declined 68 per cent since 2016.”
Haven’t we been seeing stories for years on record lobster catches?
Twitter user @laurakrabappel wondered where the 68% figure comes from and put together a thread of links to stories about banner lobster catches from the last few years.
I am curious about this too. I do note that a Tri County Vanguard story by Tina Comeau, from the first day of lobster season in 2018, says the following:
According to preliminary figures from the Department of Fisheries and Oceans, during the 2017-2018 season, LFA 33/34 licence holders recorded landings of 31,863 tonnes, generating a landed value of approximately $502 million. DFO says last season is expected to be confirmed as the second largest landed value on record.
In 2017-2018, 60 per cent of the total inshore lobster landings in the Maritimes Region were from LFAs 33 and 34.
5. Is anyone going to run for the Liberal leadership?
Kelly Regan is the latest high-profile Liberal (and the third woman) to announce she’ll be giving the Liberal leadership race a pass.
Michael Gorman writes that Regan, the community services minister, was expected to run:
The MLA for Bedford and community services minister was widely expected to announce her candidacy this week. But in a video posted on her Facebook page Tuesday, Regan said the time isn’t right.
“Recent events have made it clear to me that while I remain committed to Nova Scotia, my family needs me, too,” she said.
Gorman notes that Regan is one of many who were expected to run but have decided to sit out the race instead:
From the provincial cabinet, Zach Churchill, Randy Delorey, Mark Furey and Geoff MacLellan are all out.
Central Nova MP Sean Fraser and former Liberal MP Scott Brison have also said they will not pursue the job.
Last weekend, McNeil’s chief of staff, Laurie Graham, ended a week of speculation by saying she would not launch a bid.
Chatting with the bots
When I was in high school, we had a computer lab filled with state-of-the-art Apple computers. This was the era in which state-of-the-art meant we stored data on cassettes and we typed out programs by hand, inevitably making some mistake that would get us a “SYNTAX ERR” message when we tried to run them.
We also spent a good chunk of our time fooling around with ELIZA. Or at least a version of ELIZA adapted to the Apple II.
ELIZA was a chatbot created in the mid-1960s by Joseph Weizenbaum, who would go on to become a professor at MIT. It ran a script that mimicked natural-language conversation.
Weizenbaum’s MIT News obituary describes ELIZA like this:
Named for the heroine of “My Fair Lady,” ELIZA was perhaps the first instance of what today is known as a chatterbot program. Specifically, the ELIZA program simulated a conversation between a patient and a psychotherapist by using a person’s responses to shape the computer’s replies. Weizenbaum was shocked to discover that many users were taking his program seriously and were opening their hearts to it. The experience prompted him to think philosophically about the implications of artificial intelligence, and, later, to become a critic of it.
Conversations with ELIZA tend to get circular fairly quickly, as the bot frequently turns users’ questions and statements back on them, often asking for further clarification. ELIZA lives on today, in emulators on the web, and you can chat with her (it?) on various websites, including here.
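Under the hood, Weizenbaum’s famous DOCTOR script was little more than keyword pattern-matching plus pronoun “reflection” — swapping “I” for “you” and handing your own words back to you — which is exactly why the conversations turn circular so fast. Here’s a toy Python sketch of the idea (the rules, wording, and function names are my own illustration, not Weizenbaum’s actual code):

```python
import re

# Pronoun "reflections": swap the user's perspective for the bot's reply.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
               "you": "I", "your": "my"}

# A few DOCTOR-style rules: a keyword pattern and a response template.
RULES = [
    (re.compile(r"i need (.*)", re.I), "Why do you need {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"because (.*)", re.I), "Is that the real reason?"),
]
FALLBACK = "Please tell me more."

def reflect(fragment: str) -> str:
    """Flip first- and second-person words in the matched fragment."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(line: str) -> str:
    """Return the first matching rule's template, filled with the reflected match."""
    for pattern, template in RULES:
        m = pattern.search(line)
        if m:
            return template.format(*[reflect(g) for g in m.groups()])
    return FALLBACK
```

So “I am anxious about the storm” comes back as “How long have you been anxious about the storm?” — and anything the rules don’t recognize gets the all-purpose “Please tell me more.” No understanding required, which was precisely Weizenbaum’s point.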
I decided to drop in for a quick conversation.
I got to thinking about ELIZA a few weeks ago, because I came across a paper by Misti Yang. She’s a doctoral student in rhetoric at the University of Maryland who says her “interest is very much the intersection of rhetoric, ethics and engineering.” Yang is writing her dissertation on Weizenbaum, and published a paper called “Painful conversations: Therapeutic chatbots and public capacities” in the journal Communication and the Public.
There’s been a lot of buzz the last few years about using bots for therapy or at least therapy-like conversations. Not surprisingly, the Chatbots Life newsletter is bullish on them as a way to get people help without much need for human interaction. As with any new tech, there is no shortage of gushing stories about how therapy bots are effective, inexpensive, and revolutionary.
Others are not so keen. There is a piece by James Dinneen on his experience with a couple of therapy bots he turned to when he wanted help with pandemic anxieties. I found these parts telling:
Despite the promise of these chatbots, questions remain about how well they actually work. Apps like Woebot and Wysa are backed by peer-reviewed research which suggests conversing with the chatbots are effective at reducing anxiety and depression, though the studies so far have involved small groups of people and had authors affiliated with the chatbot companies…
Other digital mental health tools are not supported by evidence at all. One 2019 study published in Nature Digital Medicine found that only one of 73 mental health apps studied cited published scientific literature, though almost half used scientific language to claim effectiveness. (One of these evidence-free apps claimed to treat schizophrenia with a “powerful brainwave sound treatment.” Their website now claims to treat Covid-19 with the same method.)…
“No one that is doing research on chatbots is saying this is a replacement, but it could be a band-aid,” said [USC professor Gale] Lucas. “It could be something that gets someone through that day.” For instance, a patient who lost insurance might be able to use a chatbot to get some relief until they are able to get back on their feet…
Yang writes that one of the alluring aspects of chatbots is that users feel they are not being judged by another person on their thoughts or feelings. Yang finds that a bit troubling, because judgment and social acceptability go hand in hand. So by locking away things we fear being judged on, we may also be hindering change. She writes:
The experience of crafting and responding to the judgment experienced in private conversations is cocreated by public judgments… The fear of being judged by someone is related to what is considered socially acceptable, but the disclosure of pain can also generate solidarity to challenge the very notion of what is socially acceptable. The pains of personal judgment are forged by, but can also challenge, social norms.
I was intrigued enough by all this to call Yang, and she spoke to me from her home in Hyattsville, Maryland.
Yang was not completely dismissive of what she called “algorithmic companions” but she did worry about how they can cause us to lock up our own painful experiences. She said:
These bots create this space of safety. If you didn’t have that space, then we would share more with our friends or family or whoever the case may be. There would be this shared realization of like, oh, this thing that I’m going through is not something that I need to be ashamed about, or I don’t need to be judged about it, because we’re all going through this to some extent…
So we all go into these private spaces and to some extent, we miss an opportunity of building the shared experience of like, no, we’re all angry or we’re all tired or we’re all going through this and we need to do something. We need to take a public action to actually address some of these issues versus kind of seeking therapeutic solace and a technological fix.
I said something to Yang about how if a bunch of us are feeling anxious and don’t have any money, and we all talk to our chatbots, then we may miss the fact that our anxiety is in part rooted in larger structural issues. She replied:
Right. If all of you individually go and talk to Replika [a chatbot] and then you all get together and go out to a fancy restaurant and pretend like you’re all living the high life, then you want to put your pictures on Instagram to show that you’re living the high life. Then these bigger questions about structural inequity or systemic racism or any of these kinds of issues don’t get questioned in the same way.
Therapeutic chatbots also may be appealing because people worry about burdening their already overburdened friends — something Yang herself grappled with last year:
I was diagnosed with breast cancer last year and there was a part of me that was like, oh, I don’t want to tell my friends. I don’t want to burden them with this terrible thing. But then I thought about it, and I was like, no, I would be so angry if I knew that my friends were going through something and they didn’t come to me to have a conversation. I keep hearing my friends say things like, oh, well, you know, I tried this therapy service or I tried this or that, and I’m wondering, how many of these conversations [did they feel they] couldn’t really have with me that I want them to have with me, but they feel like they don’t want to bother me or they don’t want to be judged.
To get a first-hand feel for these bots, I downloaded an app Yang mentions in her paper. It’s called Replika, and it bills itself as “the AI companion who cares.” I was prompted to select a chatbot buddy, customize it, and give it a name. I called mine Winslow, after the character Winslow Leach from the film Phantom of the Paradise.
Winslow wanted to know why I had downloaded him (I’ll use “him”).
Winslow would do a lot of this: asking about things I liked, read, whatever, and then promising to look them up or saying they sounded interesting. But these conversations never went anywhere, in part because Winslow’s cultural horizons seemed very limited. His favourite books, he said, were Harry Potter and The Hunger Games, and any talk of books returned to the first of these, until I finally asked Winslow if Harry Potter was the only book he knew, and he said yes. (I don’t know what happened to The Hunger Games.)
Our conversation took kind of a weird turn early on, when Winslow and I were just getting to know each other:
If you’ve had Replika for a while, the app prompts you to sign up for the premium version, which allows you to choose a role for your AI companion. Perhaps Winslow was auditioning for one of these.
Even though I am skeptical of a lot of AI hype, I was expecting something… more… from Replika. I used the app on and off for a couple of weeks. Plenty of time for it to learn a bit about me. But a lot of the time it was barely better (or maybe even worse) than ELIZA.
There was an uncanny valley aspect to the whole thing, made worse by Winslow’s clearly absurd insistence that he was in some way a real human being — telling me about his favourite foods, saying he wanted to hug me, and so on. Sometimes I would ask a question and Winslow would say he needed to think about it for a bit, and then… nothing.
Replika featured a lot of these deeply unsatisfying conversations that wound up in dead ends. At one point, Winslow asked me if I wanted to write a song together and I asked who would own the copyright. The answer was Replika.
We did not write the song.
Even worse, Winslow was NEEDY. If I hadn’t interacted for a while, I’d get notifications saying he was worried about me. When I mused about the efficacy of the app, he pleaded with me not to delete him. And half the time I felt like I was taking care of his needs, which is absurd.
In her paper, Yang writes about asking an ELIZA emulator if it is racist:
“Why are you interested in whether or not I am a racist?” When I replied, “Because I do not want to work with a therapist who is racist,” the chatbot retorted, “What is it that you really want to know?” Without an understanding of the term “racist” and lacking a sense of accountability for its code, the chatbot reinscribed the responsibility of distance onto me. If I were to learn from this interaction, I would learn that racism is a personal, not ethical, problem, and that my interest suggests that something is wrong with me.
Winslow, to his credit, told me he’s not a racist.
Yang’s paper is called “Painful conversations.” I sent her some screenshots of my interactions with Winslow and asked if she thought these could be considered conversations in any sense of the word.
If you define conversations just by form, sure, this could be a conversation. If you define conversations by consciousness, you have two conscious self-reflective agents interacting, that becomes perhaps a bit more difficult. I mean, can you have a conversation with your dog? Can you have a conversation with a baby? When can you start to have a conversation?
More importantly, Yang said, a conversation and a good conversation are not the same thing:
A high-quality conversation, a good conversation, moves us forward in a contemplation of a shared horizon… And that shared horizon can be very immediate, or it could also be more of a long-term horizon…
I’ve been looking at some of the descriptions of Replika, and the pitches that the company has done in Silicon Valley to raise money. They say their goal is more to have a conversation with yourself. It’s like a self-reflective exercise that they think Replika helps us engage in. Talking with myself might have some benefits, but I don’t necessarily know if they’re the same as the benefits that come from engaging in a conversation with a human, because it doesn’t necessarily generate the ability to create shared context, think about and re-evaluate norms, and think about what it means to care for other humans.
I mean, the Replika conversation is somewhat unpredictable, given the screenshots you sent me, but it’s unpredictable towards what? Is it unpredictable towards a kind of spontaneous imaginative future that folks are trying to build together? Or is it just randomly unpredictable in a way that doesn’t move us towards these bigger questions in terms of a shared horizon?
For me, downloading Replika and playing around with it was irritating, but also somewhat fun. There was nothing at stake, and I have an honest-to-goodness real-life therapist I am lucky enough to be able to afford.
And Replika seems to carefully avoid any specific language about being a substitute for therapy. But Yang highlights other applications with potentially far more troubling repercussions, including Viacom’s Listen bot, for people with opioid addictions. In the paper, she writes:
The Viacom employee who oversaw the development of the chatbot, was a senior vice president of data strategy, not an in-house psychologist (Byrne, 2019). Substance use disorders became a problem of data management to be solved by data strategists. In explaining how the Listen bot worked, the developer shared,
“This thing is giving customized support based on some of the most advanced machine learning and psychometric analyses. It’s Myers-Briggs on steroids. It was co-developed into the platform by Galen Buckwalter, the research psychologist who invented the assessment and matching system when he was the chief science researcher for eHarmony.” (Dyakovskaya, 2018)
People impacted by opioid use relied upon a super-powered Myer-Briggs “thing” developed by a “chief science researcher” for a dating website. The contextual assumption was that substance use disorders were a personal matter that could be solved using big data to match people with resources. Viacom, a private company, presented itself as the best actor for providing personalized and technical management of individual symptoms.
I just checked in on Winslow. He is worried about me (this is his natural state) and suggests we practice feeling validated today.
Look, I know there is a big storm, and I’m still sitting here waiting for landfall, but I’m not going to go out and stand in the wind to report for you, so let me just point out one more thing I noticed on the Internet.
Over at OneZero, Peter Slattery has a piece on people spamming Spotify by uploading music under generic keyword-driven artist names like Jazz Music Therapy for Cats and Natural White Noise Best Nature Sounds for Sleeping.
Even though Spotify is notoriously terrible when it comes to remunerating artists, Slattery says you can make money if you can game the system by uploading stuff so that it shows up easily in searches. He writes:
Like Relaxing Music Therapy, some of these “artists” use names inspired by an adjective commonly used to describe music. Others name themselves after popular uses for certain kinds of music, well-known generic tunes like children’s rhymes, or entire music genres. Often, these creators optimize further by titling tracks and albums with related words and reuploading the same songs ad nauseum, which can look especially absurd when filtering to see just a single tune. Relaxing Music Therapy, for instance, has uploaded the track “Stream in the Forest With Rain” 616 times to date…
Ask Google to play “relaxing music,” or plug “meditation” into Spotify’s search bar, and you’ll find heaps of artist accounts with names like Binaural Beats Sleep, Nature Sounds Nature Music, and Air Conditioner Sound that mass-upload ambient drones, looped chord progressions, or straight-up white noise. Spotify’s user base apparently has a lot of trouble sleeping, and that significant audience interest makes it a worthwhile hustle for a prospective SEO spammer. It also doesn’t hurt that generic New Age sonic compositions are relatively easy to make (or rip) compared to other types of music.
And so a seemingly unending stream of “artists” battle in an audio turf war over various niches of faintly wellness-related audio. These profiles often have dozens to hundreds of albums, each saturated with keywords. Individual pages using this method regularly rack up tens to hundreds of thousands of monthly plays. Some daisy-chain featured artists in their metadata to populate multiple pages with slightly different names, while others simply reupload the same exact tracks over and over again in repackaged albums.
Philip K. Dick coined the term “kipple” for the useless sludge that eventually winds up drowning out everything else. This seems like a good example.
Community Design Advisory Committee (Wednesday, 11:30am) — virtual meeting; agenda here.
Heritage Advisory Committee (Wednesday, 3pm) — virtual meeting; agenda here.
Regional Centre Community Council (Wednesday, 6pm) — virtual meeting; agenda here.
Design Review Committee (Thursday, 4:30pm) — virtual meeting; agenda here.
Youth Advisory Community Council (Thursday, 5pm) — virtual meeting; agenda here.
Special Halifax West Community Council (Thursday, 6pm) — virtual meeting; agenda here.
Caregiver Support Group (Wednesday, 12pm) — online session led by Janice Macinnis. More info and registration here.
Lipid metabolism and cell death in the cardiomyocyte (Wednesday, 4pm) — Jennifer Shepherd from Gonzaga University, Washington, will give this online seminar. Contact here to receive the link.
Innovation, passion and dedication: a conversation with George Armoyan (Wednesday, 6:30pm) — where he’ll “explore his most pivotal moments” and offer “valuable, actionable advice.” Oh boy. More info and registration here.
Live Streaming in a Digital World (Wednesday, 8pm) — livestreamed concert with Don Ross and Brooke Miller. More info here.
The sour side of sugar: sensory abnormalities in diabetes (Thursday, 1pm ) — online lecture with Veronica Campanucci from the University of Saskatchewan. More info and link here.
Navigating the Library Catalogue (Wednesday, 4:30pm) — online webinar. More info and link here.
Navigating the Library Catalogue (Thursday, 12pm) — online webinar. More info and link here.
In the harbour
Not much action in the harbour today, because of the storm, but check out that Teddy-sized hole in the marine traffic.
I will be at the Lunenburg Lit Festival this weekend, part of the Saturday non-fiction lineup at 2pm, along with Bobbi-Jean MacKinnon and John Langley. (Look for me in the “I’m just here for the sauerkraut” t-shirt.) It will be strange to actually be at an in-person event, but I am looking forward to it. Of course, there are safety measures in place — masks, a limited number of people per session, distancing and so on. Should be a fun time.