Hormones are freaking weird.

Click to subscribe to the CysterWigs blog in your favorite reader app

Originally Posted on November 20, 2013 by Heather Hershey

And, might I add, hormones are also a huge pain in the you-know-what!

I was out of my prescriptions for spironolactone – a.k.a. Aldactone – and metformin, and I was lazy about refilling them for about a week.

Both of these pills are commonly prescribed to women with PCOS. The metformin controls two things: the amount of glucose the liver creates and pumps into the bloodstream, and the sensitivity of body tissues to insulin. Aldactone is a water pill that helps with bloat, but it is also an androgen antagonist, meaning it minimizes the impact of androgens in women with PCOS-related acne and hair loss.

So, if you can imagine, this was a generally bad lapse in judgment on my part.

I had epic cravings and epic bloat for that entire week. I returned right back to my original powerful lust for refined sugar. (It certainly doesn’t help that everything pumpkin is in season right now. I never met a pumpkin spice cake I didn’t like. Cream cheese frosting? UMDUHYESPLEASE.)

I could feel myself getting puffier.

I gained nine pounds in one week. This sounds really impressive, but I wasn’t eating any more than usual calorically. I WAS, however, eating a sh’load more carbs than I normally do.

This was clearly a big problem (no pun intended).

I filled my script and began taking my meds again STAT.

I lost five pounds in two days! This is an even more impressive finding. My tummy is flatter and the cravings are completely gone.

The moral of this story is: Remember to fill your PCOS prescriptions early and, if they are working for you, keep taking them.

I am fearless.


Originally Posted on December 05, 2013 by Heather Hershey

Alternate Title: “How I intentionally botched a massive final for my Master’s Program due to philosophical irritation.”

Some background: This is the essay I wrote to the AI Department at the University of Georgia the very day I decided to leave my Master’s Program. (I’ve never regretted the decision!) 

George Boole


Today I sat down for a final in what is arguably the most important class of my Master’s in Artificial Intelligence (AI) program: Philosophy of Artificial Intelligence, or PHIL6500.

I was as prepared as I possibly could be, arriving at the testing center two hours early armed with notes for some last-minute review. Truth be told, I was a basket case. My primary background is in behavioral science. I was prepared to take a difficult philosophy class with some logic and some heated discussion when I registered for this course. That isn’t what I got.

I realized while browsing through my handwritten notes today that the teacher – whose primary field of study is computer science (CS) – had neglected everything in Russell and Norvig’s massive AI textbook related to psychology, statistics, PHILosophy, history, and CONTEXT in favor of basic search algorithms in Java (which you could learn in other computer science courses on campus) and symbolic logic (which all the AI students would encounter in multiple required logic courses later in the program). In other words, none of this stuff mattered…and I was amazed by my emotional reaction. How dare the school turn something so important into just another basic computer science class?!

When the time came, I scrambled to write my answers neatly in the provided blue book. Page one and two were easy, three a bit rougher, and the fourth was generally tedious but manageable, the way knowledge bases were intended. The fifth page, however, is where I got lost. It was the final page of the exam questions and I completely blanked. The question was a simple application of resolution elimination. I could have done it easily, but I just couldn’t shake the feeling that something was wrong.

This is where things get crazy. I took a deep breath and suddenly all the ambient noise in the room – the loud air system, my professor’s cell phone clicking, the rustling of paper, the gentle sighs of despair and boredom, the grating fluorescent lights, the intrusive scent of sweaty graduate students behind me – all was silent in the tiny computational space between my ears. I began to drift away from my body as my mind spewed its inner narrative all over the remaining fifteen pages of my blue test booklet.

Here is my argument, in all its glory:

This essay was not requested in the test and will probably take up so much room that I will not be able to answer the final question. I know that I am probably going to fail this exam because of it, but I am fine with that. This is not the class I thought it would be. Actually, this wasn’t much of a philosophy class at all. I am beginning to get the impression that the AI department at this school is only multidisciplinary on paper and an extension of computer science in practice. If that is indeed the case, then why even bother making it a separate entity on campus? Why not make it a concentration option of the CS department if that is all you want to offer? I think the AI department isn’t simply a branch of CS precisely because artificial intelligence is not well served by an exclusive computer science paradigm.

I thought graduate school was a place for higher-level inquiry. Instead, this school’s curriculum thus far has been based on the simple memorization and application of existing formulas, concepts, and vocabulary without much discussion about WHY we think these are the best practices in our field. I’m sorry for sounding naïve, but I thought critical thought essentially revolves around the deeper levels of understanding that asking WHY can provide. Instead, WHY is reviled, suppressed, denied in favor of claims of absolute truth and logical soundness.

We, the graduate students, are broken until we no longer know how to think in our own fashion. We are then retrained to think exactly like you. Therefore, graduate school is where innovative thought processes go to die.

I will not be broken.

Now, I am not suggesting that I know more than you or anyone else in this room. This is not a matter of smug self-satisfaction. I still have far to travel on this road of life and, if I am fortunate, I will be able to learn some small fraction of wisdom about my innate curiosities and the inner workings of my own thoughts. That is why I am here. Ancora imparo, and all that jazz. However, I think that treating us like ignorant sheep that are only capable of mind-numbing repetition, pointless and arbitrary paper composition, and organized baby steps in research does us no favors and only serves to provide “grunt work” research for your CV.

Since industrial-era symbolic logic is the only thing of a philosophical nature we touched on, I would like to address it at this time.

I have a question for you: Why do so many very intelligent people have difficulties with symbolic logic? I think this is a relevant question since you have placed so much emphasis on it during the course of this class, despite the fact that other classes in our curriculum cover it better and in more depth. Could it be because you are attempting to distill the whole of human thought and communication, as well as all of the other chaotic influences of the natural world, into a highly structured, antiquated framework that is always logical, neat, and tidy?

Life doesn’t work that way. If you continue to do things as you’ve always done them — and refuse to think outside of the prevailing mindset — you will never reach full, general artificial intelligence. The Singularity will always tantalize, but never be fully attainable.

This query is particularly important in light of known limitations within the burgeoning study of natural language processing. NLP is something that, to date, the field of artificial intelligence has struggled with precisely because it is bound to the rigid confines of symbolic logic while operating in an illogical medium (i.e., natural human communication), which is always in flux.

Let’s take it back to the source. What about George Boole? Prior to Boole, logic was full of syllogisms and pretty light on anything resembling algebra. Actually, if you think about it, algebra is just another language, as all of mathematics is simply another way of describing what we as humans encounter during the course of our lives. It’s a means of measuring, but also a means of conveying information about said measurements. This gets lost very early in young people’s mathematics education. It leads to a this-or-that mentality in which students are socially pressured at a young age to pick a preference, without ever being shown the big picture: how natural spoken language (we’ll use English as an example) is similar to, yet different from, formal languages like symbolic logic and math. And prior to Boole, logic was much closer to spoken language than to algebra.

If an eccentric sixteen-year-old boy from an impoverished home came to you and said that God had spoken to him in a dream and told him exactly how people think, would you give his ideas any credence? Of course not. People in Boole’s day didn’t take him very seriously, either, at least not in his early career. Posthumously, we consider the man a genius, a grandfather of modern computer science and the father of symbolic logic. Boole had his teenage fever dream, and I cannot help but think it is something we would be very dismissive of today. He had no formal education of any kind, yet he taught at the university level without ever setting foot on a campus as a student. Actually, he was persuaded against attending college for fear that it would interrupt his intellectual pursuits. His peers (De Morgan chief among them) wanted to protect him and his mind from the corruption that occurs when professors compel their students to abandon their own intuitive deductive processes in favor of those of the establishment. He probably wouldn’t have been able to afford the tuition anyway. (Some things never change.)

God told George Boole, at least according to Boole, that everything we encounter in life can be neatly divided into two opposing categories: TRUTH, which is where God lives and is represented in Boolean algebra as a number “1”; and FALSE, or the absence of truth (and therefore the absence of God: “the truth, the light, the way” – can you see what his Victorian mind was getting at?), represented by the null, or “0”. It doesn’t take a rocket scientist, or even an AI researcher, to realize very quickly that while this may be very useful in simple applications, this kind of reasoning hardly resembles anything like intuitive human reasoning processes.

Symbolic logic (SL) was supposed to mirror human reasoning. I mean it came from “God” to an uneducated teenager in a dream. What could possibly be faulty with this?


  • SL reduces words with meaning into nothing more than empty variables, operators, and (in first-order logic) quantifiers.
  • SL over-simplifies the nouns of natural language by replacing them with variables while placing unique emphasis on operators, like you would in math.
  • However, people – who are probably not using elaborate SL proofs to reason their way through myriad sundry daily problems – generally place emphasis in speech on nouns, verbs, and existential claims.
  • Existential claims in SL are much weaker than statements such as “I am” are in English.
  • As mentioned above, SL was initially developed to describe human reasoning, as if classical logic were inadequate because it relied more heavily on natural language than on mathematical language.
  • SL is used to “translate” natural language (like English) into a formal one (propositional calculus, predicate calculus) for ease of use. That means we use it because it’s convenient. To suggest otherwise is a lie.
  • Losing information via data “translation” of this nature is a notorious problem within artificial intelligence. It is one of the principal reasons why we, as a field, always seem to veer away from projects that require substantial knowledge of human thought processes in favor of projects that merely require a simulacrum of intelligence. We program for rationality and NOT intelligence.
  • This means that our field, though scientifically sexy, is way behind the trajectory for where we thought we’d be by now and way less advanced than the general public assumes.
  • A primary means of “proving” an SL argument is reductio ad absurdum: you assume the negation of the statement, derive a contradiction, and conclude the original must be true. It’s an easy proof, but the mechanics of how something becomes true just by showing that its opposite leads to absurdity still kind of mystifies me.
  • I think one of the first things we learn as children when acquiring language is that it is not possible to answer every question in every given situation with a simple “yes” or “no”.
  • Presenting a logically valid argument (i.e., one whose conclusion follows from its premises under all circumstances) will not always result in a conclusion that is factually true, because the premises themselves may be false.
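That last point – validity is a property of form, not of facts – is easy to check mechanically. Here is a minimal Python sketch of my own (illustrative only; the exam was handwritten, so nothing like this appeared in the blue book): a brute-force truth table confirms that modus ponens holds under every assignment, while the material conditional counts as “true” whenever its antecedent is false, no matter what the real world says.

```python
from itertools import product

def implies(p, q):
    """Material conditional: p -> q is false only when p is true and q is false."""
    return (not p) or q

# An argument form is VALID if it comes out true under every
# possible assignment of truth values to its variables.
# Modus ponens as one formula: (p and (p -> q)) -> q
assignments = list(product([True, False], repeat=2))
modus_ponens_valid = all(
    implies(p and implies(p, q), q) for p, q in assignments
)
print("modus ponens valid:", modus_ponens_valid)  # True

# But validity says nothing about facts: with a false antecedent,
# the conditional is vacuously "true" regardless of the consequent.
# "If the moon is made of cheese, then 2 + 2 = 5" comes out True.
print(implies(False, False))  # True
```

The vacuous-truth behavior in the last line is exactly the kind of thing that trips up intuitive reasoners: the formalism is internally consistent, but it quietly diverges from how people actually judge everyday “if…then” claims.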

My hand is starting to cramp, so I will end this screed with a few closing statements.

Why do so many intelligent people have difficulties with symbolic logic? I think the answer is fairly straightforward. SL simply isn’t intuitive for most people. Therefore, if the primary function of SL was originally to serve as a God-designed, teen-genius-delivered model of human intuition, it does a pretty terrible job. I would also argue that the more naturally intuitive and rational you are, and the more competent you are with macroscopic systems-level problems, the more difficulty you will have trying to force your natural thought processes to conform to this rigid formal framework. Big concepts don’t always break down neatly into bite-sized binaries.

George Boole changed the world. He did it without a degree. In the age of computers and rapid access to information, the George Booles of the world are left out of the dialogue entirely because they lack easily identifiable credentials. That is why most graduate students come to school. They need your paper to validate their academic ability. They will conform if you instruct them to do so, as the penalty is failure and with it the lack of crucial external validation. I am no better, though I will say that I have a much broader perspective on the importance of this program and my role within it. I do not consider this impromptu essay a cop-out or a hail-Mary pass. It was a conscientious decision to tell you that I have objections and that I am fine with whatever happens from this day forward. This degree is vanity. Because I am aware of it, I am free to think for myself. (See how powerful “I am” can be?)

Carl Sagan’s widow, Ann Druyan, once said (mind you, I’m paraphrasing because I don’t have the text in front of me), “Science can’t give you absolute truth because it’s a permanent revolution, always under revision, and only capable of providing successive approximations of reality.” We can do a lot of very interesting things in AI. I just don’t think that settling for something that is merely rational is nearly enough. We once had such lofty ambitions. We can do better than expert systems and semi-autonomous cars and planes. Those things are neat, but there has to be more to this field than simply aiding big businesses and the military. Resting on what is known seems rather against the very nature of scientific innovation. As an academic discipline and a science, I have to think we can do better than this.



As you can probably tell, I took full advantage of the entirety of my blue book plus its back cover as well as the two hours allotted for the final. I gave it a quick read-through, closed the cover, and turned it in. I felt such massive relief! I literally laughed (softly, to myself) the entire way back to my car. I came home and brain-dumped my essay for your enjoyment. Now if you’ll excuse me, I’m going to drink a little wine and watch Red Dwarf! 🙂


PS: I don’t feel guilty for reducing George Boole to such a negative description in my essay, i.e., calling him an “uneducated teenager.” He was an autodidact without much in the way of formal education. I also think that fact speaks to my overall point.

PPS: Please feel free to contact me if you are recruiting students for an upper level AI or Cog Sci program. I would enjoy an opportunity to go to a more progressive school!


My name is Heather and I am a wig addict.


Originally Posted on September 09, 2013 by Heather Hershey

Hi wig lovers! This is my personal blog. Since every aspect of CysterWigs is an extension of my love for wigs, I thought it might be fun for our customers to see some of my personal blogs in addition to the ones related to hair.

I plan to talk about a lot of things: my latest diet, the constant battle with PCOS, grad school, psychology, computers, and anything else that pops into my brain.

I don’t want to be another cold corporate entity that takes your money and stands for nothing. This is a small operation. Because of that, I have the ability to stand for something. I don’t have to be generic to try to appeal to everyone. I will, however, try to make well-researched posts so as not to offend or mislead.

So, I hope you enjoy my blog. Please share it with anyone who might enjoy it!

– Heather

Confession: I am an introvert.


Originally Posted on September 09, 2013 by Heather Hershey

It’s official: I am definitely an introvert.

Graduate school has put the issue to rest. My assistantship has doubly reinforced it. I get invited to parties and get-togethers all the time and have, as yet, only gone to a few. Last week I went to watch someone’s band play for the first time and found myself reading an article in the NYT on a barstool instead of engaging with the drunks. When it was all over, I couldn’t figure out a way to get to my friend near the stage through the sea of inebriated coeds with minimal awkwardness…so I bolted.

The strange thing is – I’m not shy.

This is a seldom-noted aspect of introversion that a lot of introverts report. We can be shy, but most of us are only moderately averse to social situations in general. The real defining trait of an introvert is the loss of psychic energy during interpersonal social exchanges. We’re the ones who think creating and maintaining a social persona is hard work (read: “emotional labor”), unlike extroverts, who tend to find maintaining that social façade an energy-giving creative outlet and a generally good time.

I’m a performer. Public speaking and singing are EASY for me. I shine in social media exchanges and on YouTube. These are obviously highly impersonal exchanges and less reciprocal than the give and take of an in-person conversation. Being larger than life in choreographed, anticipated bursts takes much less energy than being “me” one-on-one for long stretches of time. It takes a lot of work to cork the bottle containing my jittery over-thinking.  I am not very good at doing this while anticipating the moods of others and can get easily burnt out by the process.

I can be quite a jovial, charismatic creature when in small, familiar groups of people in comfortable settings. Large crowds are exhausting and the incessant overstimulation thrown at the undergrads around my college campus makes me want to burrow under my bed with a 500+ page book about the biochemistry of human metabolism and the neurology of the enteric nervous system and never re-emerge.

I hate small talk. I would much rather hear about your subjects of study or something that matters to you. I want to hear about things that make you YOU. YOU are interesting. Random trivia, not so much.

I don’t have ADHD. I am incredibly uncomfortable living in an extroverted paradise of neon ADHD-ticking delights and the myriad accompanying voices vying for attention.  I just want quiet. Or some really good music. The point is I wish I could be without some of the noise and distraction.

Most of my friends think that I’m extremely extroverted. I don’t think that’s true.

I remember one time in eighth grade when I made myself go to a party for an acting group I was involved in. I hid in the stairwell while the other kids had fun. I cried because I wanted to go home, yet I refused to call my mom. You see, I wasn’t crying because I didn’t like the other kids; I was overwhelmed and didn’t know how to navigate these tumultuous social waters. I was frustrated for wanting to hide and equally angry at myself for engineering a social situation in which I would have to feel like this.

I was a painfully shy kid. I force myself to act in a more extroverted fashion – even when it is painful – and over time this behavior has allowed me to build up the skill set necessary to survive in an extroverted world. It is internally difficult for me to reconcile this behavior with the notion of somehow being a little less of myself in the process. At the end of the day, though, I am still ME.

I am not a social butterfly. I just play one during the day time. The chronic discomfort means I’m growing. In 2011, IBM’s Ginni Rometty said, “Growth and comfort do not coexist.” I guess this is something most of us know intuitively. I know it’s true through my experience. I would be in pretty sorry shape if I didn’t have the gumption to take on these challenges proactively. I intentionally put myself in positions that require heavy social interaction in the hope that I will rise to the occasion.

If you are a closeted introvert like me, don’t worry. We’re in this together. And if your supersmartsuperbrain is pondering what I’m pondering, someday we’ll take over the world.


PCOS is the worst.


Originally Posted on September 09, 2013 by Heather Hershey

PCOS sucks and doctors should definitely know more about it by now.

You know there’s something wrong. You know it’s diet related. The frustrating thing is the lack of a clear cause. Did the chicken come before the egg, or vice versa? In other words, does your innate desire to accumulate sexy, socially desirable body fat </sarcasm font> magically give you PCOS, or does the PCOS incorporate metabolic agents that make you gain the fat to begin with, like a Crisco-laden canary in the coal mine? The difference between the two paradigms is vast, as one implies that it’s YOUR FAULT you’re sick and the other makes it seem like nature is playing some sort of demented trick on you.

Even when doctors sympathize with us, they still fall back on outdated dietary training from their time in medical school, which says that fat shaming is the best mode of operation even though we’re too lacking in willpower to follow through with meaningful diets. I mean, how dare we regain hard-lost weight? We obviously struggle with our weight just to torment our physicians. </sarcasm font>

I watch my diet meticulously, yet the weight only trickles off. I often get really frustrated about it, but like you I just keep soldiering on.

For now I will continue to plug away at things. I am reading “Good Calories, Bad Calories” by Gary Taubes, and he makes some wonderful arguments for why the healthcare establishment is failing us by sticking to a patient-blaming, paternalistic paradigm rooted in some fairly dubious dietary science. It’s a great book. I want to pass along its name to any of my Cysters who think it might be useful in their own struggles with this insidious syndrome.

I love you guys. I wish you all the luck in the world on your journey.