Wednesday, August 29, 2012

“I swear sometimes they’re watching me.”

For quite some time now I have heard that Katniss Everdeen is a strong female character, which is quite a perplexing rumour. It is one of those inexplicable misconceptions that annoy people who do not feel the same. Sure, the books were reasonably good (not amazing, not bad), but the protagonist was one of the weakest I’ve encountered so far.

To those who don’t know who she is: read (or watch) The Hunger Games, though I suggest you don’t do both, as the movie will be quite ruined. She is the protagonist and, fair warning, this post will be rather full of spoilers.

Let’s go through what we know about Katniss. She comes from a tough neighbourhood where her family is one of the better off (her mother and sister are respected as the local healers, Katniss as the contraband meat supplier). She spends her time running around the local forest with some dude she doesn’t fancy but who fancies her. She does not like violence, yet ends countless lives on a daily basis, which makes her a hypocrite. Maybe she only wants to kill those that don’t fight back, which would somehow be even worse. She stands up for her sister when Primrose is picked to go to the Hunger Games. So she values someone else’s life above her own, maybe she is even suicidal (by that point it is painfully clear that she does not want to live how she has lived so far).

Going to the games she has constant trust issues concerning the other tribute Peeta and her mentor Haymitch. She hits borderline depression but gets slightly better when her dressing specialist makes a few pretty dresses for her. So she is insecure, but that can’t be too rare among fatherless teenage girls. She confides in the person she thinks will die soon. Once at the games, she pretty much dreams through the starting signal and ends up getting only one package of equipment while almost getting herself killed. The first person she has a confrontation with almost finishes her and then gets killed by a third party. After a little wandering she finds out her co-tribute has teamed up with her competition and is willing to kill to survive. She takes it as a huge betrayal because she hadn’t thought of making friends with other tributes; she wanted to go solo. She isn’t very bright.

She climbs a tree and waits there until yet another tribute, Rue, signals her to look up. Without outside help, she could not figure out that she could use her surroundings in her favour. She definitely isn’t very bright. While stuck up in the tree she hears Peeta tell his friends to stop chasing her, thus protecting her. She ignores it, although by that time Peeta has made it obvious he has feelings for her. She drops a nest, thus killing some inept tribute, and makes off with a bow. She ends up hallucinating and requires even more assistance from Rue. Then she concocts a plan that gets Rue killed, while she uses a bow where a rock would be more suitable. Not the brightest star in the sky. The first thing she does right is picking some flowers, and she gets some bread for it.

Hopping forward to the cave part of the story, she acts very on/off in regard to her and Peeta’s relationship. Sure, she keeps him alive, but only just. After countless hints and multiple people blatantly telling her that she is being shown on screens across the country, she finally figures out, in a rare moment of genius, that the people watching might actually be interested in what they are watching. While getting medicine for the poor lad, she ends up almost killed by some other girl, who happens to be so loudmouthed that she accidentally not only saves Katniss but also allows her to retrieve the medicine without further danger. She got lucky that her ‘trained’ opponents are immensely stupid and honourable ‘killing machines’. A little before the grand finale she and Peeta manage to off yet another tribute by ignorantly picking poisonous berries. They got stupid lucky that a hungry dude did not eat some berries while picking them. In the end, Katniss yet again values somebody else’s life over her own.

It gets a lot worse as the story progresses. Important people around her start carrying around mockingjay pins, watches, whatever, and they make sure she notices these objects, as if it were a sign or symbol or something. She remains completely oblivious to the constant mockingjay fandom, even though she knows full well that mockingjays were a humiliation for the government and hence are in very poor taste among the upper class. Pretty much every reader figured out there was a group of rebels in the upper echelons of power ready to help her a few hundred pages before someone told her. Heck, she didn’t even get it when some of her fiercest opponents suddenly started working to keep her alive at any cost. Even when she gets out of the second arena (a process in which her bow actually becomes somewhat useful inside an arena, due to a massive cockup by the other tributes) she runs around more as a mascot than as someone actually useful. And she fails to protect her boyfriend and her family.

All in all, she created a spark she did not mean to by being ignorant, survived by being helpless on her own and became a success because other people wanted that to happen. She had no regard for ethics or safety (other than when she climbed trees) and apparently thinking was not her strong suit. Note that I am not criticizing her education, but her ability to understand painfully obvious clues and make the simplest conclusions based on uncomplicated known facts. Now how is that a strong female character? This is a serious question that expects an answer, preferably with proper argumentation.

She created a spark but the fire would’ve been better off without her. A mascot is almost as good as a martyr.

 

“There are only two ways in which we can account for a necessary agreement of experience with the concepts of its objects: either experience makes these concepts possible or these concepts make experience possible.”

Immanuel Kant

Sunday, August 26, 2012

The best lies contain a hint of truth.

Americans fog things up. This is especially true in the Samsung v. Apple case. I quote[1]:

“The jury in San Jose, California ordered Samsung to pay Apple $1.05bn (£665m) in damages. […]

In recent weeks, a court in South Korea ruled that both technology firms had copied each other, while a British court threw out claims by the US company that Samsung had infringed its copyright.

But the year-long US case has involved some of the biggest damages claims. […]

It deliberated for less than three days before coming to a unanimous decision, rejecting all of Samsung's claims and upholding five of Apple's allegations, including:

- Some of Samsung's handsets, including its Galaxy S 4G model, infringed Apple's design patents for the look of its iPhone including the system it uses to display text and icons
- All the disputed Samsung devices had copied Apple's "bounce-back response", which makes lists jump back as if yanked by a rubber band
- Several Samsung devices incorporated Apple's facility allowing users to zoom into text with a tap of a finger

Apple had wanted $2.5bn in damages. Samsung had sought $519m.”

Note that the phones on the left are using Windows, the phones on the right are running Android.

In short, Apple sued Samsung because of Android (because the interface is as annoying as an iPhone’s), a large screen and rounded corners. There were minor squabbles about the hardware as well, which were exactly that – minor squabbles. The South Korean court said Apple and Samsung copied each other, ordered them to pay each other (Samsung had to pay about $12,000 less) and banned the sale of the new Samsung Android devices and Apple’s iPhones and iPads in the country; the British court dismissed the whole case; and the American court ignored Samsung and gave Apple the victory – Samsung has to pay over a billion dollars to Apple, with no bans on sales. In fact, Samsung had a reasonable defence – Apple was not the first on the scene with the design and hence does not hold the patent rights; the iPhone’s design was inspired by SONY’s designs.

“However, Samsung was not permitted to present evidence of this - including a filing showing Apple designs incorporating Sony's logo - because it had presented the documents at a late stage in the legal process.

The South Korean firm did take evidence from the creator of a mock-up tablet computer featured in a concept video produced by a newspaper company in 1994. The designer said he had subsequently talked to Apple about the idea.”

To explain this even more simply:

Except that instead of Microsoft it is Samsung and both parties resort to court.

 

What baffles me about this whole ordeal is the results of the three courts. One punishes both, one goes Swiss, and it happens to be the American one that hands an overwhelming punishment to Samsung. The laws are pretty much the same, the cases were extremely similar, and the evidence brought out must’ve been pretty much the same as well. And yet, three very different decisions. I, personally, like the South Korean court’s decision to ban the sale of the devices that infringe patents, as it was the sensible thing to do. The products will continue to infringe the patents even after the trials; Samsung probably won in that respect – buying the rights to Apple’s patents on a typical smartphone design would probably have cost a lot more. Banning the products, on the other hand, is much more than a mere billion-dollar slap on the wrist for both companies for such tomfoolery. It would prevent further bickering court cases, punish the guilty severely, force innovation and clean the market of generic products. As for Android, that thing really needs some working over. For something based on Linux, it is an amazingly sloppy piece of work.

Sleepidy-sleep-sleep.

"Tyranny, you say? How can you tyrannize someone who cannot feel pain?" Chairman Sheng-ji Yang

Is it not unethical to ‘torture’ people or beings that don’t know they are being ‘tortured’? The reason I say ‘torture’ instead of torture is that ‘torture’ comes in many more forms – withholding the truth about a situation from someone who does not know it is analogous to introducing the concept of pain to someone who has never felt it.

We have some kind of innate desire to teach others, to share information and experiences. For some stupid reason we want robots to experience emotional response, feel compassion, maybe even love. We want to teach them to be human, until someone dares to ask ‘is that really a good idea?’, because then we start thinking of the pain that emotions cause. This has caused numerous works of fiction dealing with robots emulating rage or anger and going on robot rampages. Then again, the idea of ‘robots no longer need us’ has done the same…

And it does not stop with robots – we as a society want every person to be able to feel the same. If there is a very apathetic person, we try to change that. If there is someone who can’t feel certain emotions, we stick a label on that person. We want everybody to be like us, to feel and understand as we do. Even when we know for a fact that we are far from perfect.

Sure, there are those who don’t want everything to be familiar; some prefer life to be a little more interesting, with beings having different attributes and capabilities. But this is generally caused by the familiarity of difference – if you understand how someone thinks, you really don’t want to give them emotions to screw everything up. That someone would no longer be predictable or, therefore, controllable. And someone who discovers a strange thing that affects another can be a mighty dangerous thing. Finding emotions clouds judgement, clouded judgement leads to pain (mistakes), pain causes suffering (consequences of mistakes), suffering leads to hate and so on.

This is why for some situations there cannot be a single solution – every situation is different. You can ask what to do if someone is being used for someone else’s benefit without knowing about it, but without proper analysis of the whole situation, you cannot know the answer. Would it help if that someone knew? Could you stop it? More importantly, should you stop it? Is it even somehow bad that a person is being used for somebody else’s benefit without his or her knowledge? Not really.

 

You might argue that the lack of pain and the lack of knowledge are too different to be analogous. But pain, in essence, is a concept of the mind, a result of the interpretation of signals. Without the concept of pain, a person does not have the knowledge of pain. Without the concept of being manipulated by someone they trust, a person does not have that knowledge either. Hence, you can manipulate someone who does not know he or she can be manipulated in that way, and you can tyrannize those that feel no pain. But that is not necessarily a bad thing. What you think is best for another surprisingly often is not.

Wednesday, August 22, 2012

The greatest joy for a geek is to do the impossible.

“The popular stereotype of the researcher is that of a skeptic and a pessimist. Nothing could be further from the truth! Scientists must be optimists at heart, in order to block out the incessant chorus of those who say "It cannot be done."” – Provost Zakharov

Often people tend to confuse philosophers with scientists for some inexplicable reason. Philosophers are the ones that keep thinking and theorizing, hardly ever doing anything. This is why there are very few philosophers who do it full time. It is generally a side gig, a hobby. It is the supporting structure of scientific innovation; it is simply one way of pondering the question ‘why?’. In that sense, philosophy is a lot closer to religion than science (though it can be debated that science itself is a religion). But there is quite a difference between the three.

Religion is a way of explaining how; it generally does a poor job at explaining why. It teaches us how the world was created, how laws originated, even how we should live. The only ‘why’ explanation appears to be bad karma – disobey and you will suffer. It is amazingly conservative, although most people ‘misinterpret’ it. That is to say, there appears to be no common interpretation per religion. Sure, there are similarities (all Christians believe in the Ten Commandments), but detailed interpretations differ (some of them believe in killing in the name of religion anyway).

Science is also a way of explaining how, and doesn’t even try to explain why. It teaches us how the world is built, how it functions, and how we can make better use of it. Even how to create better laws and how we should live. Science, however, falls into two categories: one changes slowly, one changes nearly every day: "There are two kinds of scientific progress: the methodical experimentation and categorization which gradually extend the boundaries of knowledge, and the revolutionary leap of genius which redefines and transcends those boundaries. Acknowledging our debt to the former, we yearn nonetheless for the latter." (Provost Zakharov) The more interesting one tends to be the ever-constant innovative spirit[1] of science – every day new theories are worked out and proven, research is done, experiments are carried out. A lot of work goes into it and most people never hear of it; the results are published in science publications limited to certain narrow branches of science, which are commonly read exclusively by specialists of the field. Some discoveries and theories make it to the public media, where they are twisted and bent so far that very little of them remains for anyone to actually understand the discovery or theory.

The slower side of science is also conservative: it classifies what should be made known to people, what should be taught at specific points in a person’s educational progress, which theories correlate with other previously tried-and-true theories and which theories are rendered obsolete. The categorization and classification process is not an exact science and many important discoveries can lie dormant without anyone learning of them for quite a while.

Philosophy rarely discusses the question how, but it almost always deals with why. Why you should think, why you should consider something ethical or non-ethical, why should ethics matter, why matter exists, why existence itself is so bloody important. Sometimes it tries to teach people how to live, how to achieve enlightenment, how to experience true joy, how to tell love apart from other emotions… but unlike religion it does not include threats. Instead of telling people to do something because otherwise they will go to hell, lose a loved one or a limb or a whatnot, philosophy tells people how to improve their lives on a mental or spiritual level, because otherwise they will continue to live life like they have been for decades, which is not that bad either. On the whole, it is a lot friendlier than religion and more open than science – everybody can practice it as it doesn’t require any specialized base of knowledge. The first thoughts and theories are likely to coincide with previous theories, probably by famous people who have been dead for quite a while.

This is actually what intrigues me about philosophy – the first things to ponder are probably beauty and ethics, and the first results match the theories of the first great philosophers. As a person keeps thinking about odd stuff, he reaches the problems of education and the reasons for having a state for a body of people, sometimes hitting Buridan’s ass on the way. This leads to questions of free will and perception while figuring out that ‘we are made of space stuff’. It is a person’s own journey through history without being aware of it; the development of the mind of a human traces the evolutionary development of philosophy as a whole. And, as biologists may have noticed, this is quite expected given previous observations of other evolutionary processes, where a single individual goes through all the steps that were necessary to create it.

Science, religion, and philosophy have certain similarities, but due to their different natures they cannot be thrown in the same pot as equals. Religion is the input, science is the interaction, philosophy is the output.

Sunday, August 19, 2012

Wait it out.

Euthanasia is an ethical dilemma between free will and empathy. Sure, it is not the way most people would phrase it, but that is what it boils down to – euthanasia is supported by empathic people who understand that in cases of imminent, certain and painful death, it is not always justified to stretch the pain out so that the person suffers longer. The end result (death after torture) does not change, only the amount of torture does. But since the person is often considered unfit or unable to make the decision oneself, others are given the chance to choose instead. ‘Others’ being relatives (in the case of allowed euthanasia or DNR) or the local government (in the case of a fixed course of action if euthanasia becomes an option). Those that oppose euthanasia tend to stress the lack of free will – the person in question usually does not have a say in the matter, which causes someone else to ‘play God’, meaning decide when the person dies.

There are a few nuances about euthanasia, such as the possibility of misdiagnosis, experimental treatment, miraculous recoveries, etc. In essence, the situations where avoiding euthanasia would allow recovery. After all, medicine is not an exact science; surprisingly much of it is a sophisticated method of trial and error – there are no surefire cures that work perfectly for every single person, and there are diseases that are nearly impossible to diagnose before an autopsy. The high rate of successful treatments is owed to its sophisticated nature and centuries of experience – if three objects all look like ducks, quack like ducks and walk like ducks, then they probably are all objects of the same nature – ducks. Naturally a few odd swans happen to come along once in a while, but they are more of a rarity.
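Programmers will recognize the duck test as duck typing. A toy illustration in Python (all the classes here are invented for the example):

```python
class Mallard:
    def quack(self) -> str:
        return "Quack!"

class RobotDuck:
    def quack(self) -> str:          # walks and quacks convincingly enough
        return "Quack!"

class Swan:
    def trumpet(self) -> str:        # the odd swan that comes along
        return "Hooonk!"

def diagnose(bird) -> str:
    """Classify by observable behaviour alone, like a trial-and-error medic."""
    return "duck" if hasattr(bird, "quack") else "not a duck after all"

for bird in (Mallard(), RobotDuck(), Swan()):
    print(type(bird).__name__, "->", diagnose(bird))
```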

Which is why in controlled situations where the patient is obviously slowly and painfully passing away, euthanasia is quite a charming option. In fact, it is very similar to the problem of ‘who wants to live forever’ – if life means constant torture, inability to do anything you want, unbearable pain… do you really want it? The main difference is that when you have to decide whether you wish to live forever, you make a decision about yourself. In the case of euthanasia, it is usually either you make the decision about somebody else or somebody else makes the decision about you (assuming you are involved).

It is important to point out that attempted suicide is generally illegal and not thought of very fondly. Actual suicide is a bit harder to classify as ‘illegal’, but it is also considered rather cowardly and humiliating. In other words, if the person is terminally ill with no hope of recovery, suicide (or the attempt thereof) would definitely be considered a really bad decision; a person is supposed to face his devils, not escape from them. And that puts euthanasia in a rather dim light – if the person in question should not decide to die, how can anyone else make that decision for him? Curiously enough, if mental torture leads a person to end his life, it is considered an atrocity. Letting a person suffer long-term physical torture (disease) is considered an atrocity as well. Therefore, suffering is demeaning, but ending the suffering is even worse. You’re welcome to choose the lesser of the two evils.

 

Quote from BBC about the Czech Breivik-sympathiser: “Neighbours described him as a shy and polite man, although there were complaints after a few minor explosions.”

Friday, August 17, 2012

“Earth is the cradle of humanity, but one cannot live in a cradle forever.”

“I want that you should take care that the plague should not infect you—not the Black Plague, but the plague of Skepticism so fashionable among Wilkins’s crowd. In some ways your soul might be safer in a brothel than among certain Fellows of the Royal Society.”

“It is not skepticism for its own sake, Father. Simply an awareness that we are prone to error, and that it is difficult to view anything impartially.”

“That is fine when you are talking about comets.”

“I’ll not discuss religion, then.”

-Neal Stephenson “Quicksilver”

Times change and we change with the times. As Sir Harold Kroto said, he used to know every little detail about the process of taking a photo, as he carried each step out himself. Nowadays we only press a button and everything is done for us; we are not intimately aware of the process. Due to the rise of digital technology, we no longer know ‘how’ things work. Sure, there are commands and electrical signals and capacitors… but we don’t know the underlying architecture nor the algorithms built on it. Often enough we don’t even know whether the thing we are using, be it a car, a computer, or a speaker system, uses actual programmed commands to carry out its function or whether simple signals are enough. We no longer need to.

But a lot of magic is lost in the process. You know what I mean – the reason you’d rather read a book than watch a movie based on it, even though the movie takes less time; the reason you’d rather go see a live show than listen to a studio recording of superior quality; why you’d rather go sunbathing than visit a solarium. The experience is different. Hell, if I could, I’d use a dedicated writing station to satisfy the creative juices. Innovation is crucial and pretty much everyone enjoys it. New technologies enable us to have new experiences while locking away our old ones. This is why the Internet Archive, Project Gutenberg, DOSBox and many others exist – to give us access to those old memories and experiences. But as much as one can try, some things simply disappear. Until we are surprised to see them once again.

Wednesday, August 15, 2012

Technology allows for amazing innovation and incredible stupidity

Windows 8 is coming, there is no denying that. And it will make many people turn to Macs and Linux, with Linux being the more probable option. You may wonder why Linux would be favoured over MacOS, as it has only a tiny user base, a lot of people haven’t heard of it, and its new user interfaces are quite bluntly horrid (GNOME, Unity and KDE certainly are). However, once you have bought a new PC and discovered the user-unfriendliness of Win8, you are hardly going to go out and buy a new computer, as the old one is most likely not going to be refundable. This means a change of OS is the likely option. Although Hackintoshes (PCs with MacOS installed) exist, installing MacOS is a more gruesome process than installing a Linux (which even has a Windows installer).

Unfortunately, the solution is not as simple as getting rid of Win8 and installing a Linux (or the awful MacOS, for that matter) to replace it, as Microsoft has decided to take note of Apple’s infamous marketing strategy – let nobody do anything they might actually want to do. This means ARM processors[1] that are incompatible (or hardly compatible) with the familiar x86 architecture (the AMD and Intel processors pretty much every computer uses, except for mobiles and Apple tablets). Yes, this affects the tablet version, not the computer version, but it is annoying for anyone who wants to use their newly bought shiny tablet.

So it has been established that the switch can be difficult and the switching options are rather limited. But why the switch in the first place?

As mentioned already, Microsoft is imitating Apple’s strategy, which limits the user and nets profits for the company. This applies to all of the content published for the new OS, but applications are the most affected type of content. Even if you could settle for the idiotic user interface (now named Modern UI because Metro caused legal trouble), you are hindered by quite a few other things. The tablet version (the one where the interface is actually of any use) allows installation only via the built-in store. That means no third-party applications or anything developed by anyone who does not want to give Microsoft a third (actually 30%) of their earnings[2]. And to show that Modern UI is one of the worst user interfaces we’ve seen lately, I present this proof:

The problem does not end with that – allegedly the Task Manager does not simply kill processes; it starts checking for solutions. Sure, that does not sound all that bad, but consider that it will do so without warning and without any option of turning it off. And it will do so, hogging some resources, every single time you wish to kill a process (which, if you are trying to use older applications or are stuck with Win8 before the first Service Pack, is pretty likely to happen surprisingly often).
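If you would rather skip the solution-checking detour entirely, the old command-line route still works. A minimal sketch in Python, wrapping the built-in Windows taskkill utility (the process name below is a made-up example):

```python
import subprocess

def force_kill(image_name: str) -> None:
    """Force-terminate every instance of a process by its image name.

    Delegates to the built-in Windows taskkill utility, so no Task
    Manager (and no unsolicited solution-checking) gets involved.
    """
    # /IM selects processes by image name, /F forces termination.
    subprocess.run(["taskkill", "/IM", image_name, "/F"], check=False)

# Example: put a hypothetical stuck legacy application out of its misery.
force_kill("stuck_legacy_app.exe")
```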

What bugs me personally is that they still call it Windows.

This has barely anything to do with windows. To clarify, these boxes are called ‘tiles’, and all applications within Modern UI are either full-screen or, if you really want, two applications side by side filling the entire screen. No more resizable windows, which will make people with larger screens or multitasking abilities pretty furious. You can only pick up to three of the following at any time: use the intended user interface, watch a movie, talk to people, browse the internet, make notes, write something in Word, check your mail. Unless you choose not to use the intended user interface, in which case you end up with no Start menu. And who doesn’t like going from folder to folder to run each program individually?

I asked a local official representative of Microsoft why it is still called ‘Windows’ and not ‘Tiles’ (obviously because people are familiar with the name and are more likely to buy it), but he could not quite answer. Instead, he said that windows are still possible to have, hence there is no conflict.

As a curious aside, Windows was originally named after the windows you could open; it was a revolutionary idea – everything before it was full-screen. It was what made it so different, so great. With the new Modern UI, Microsoft has taken a step back to its roots – the main user interface features no more windows, just as DOS did not (Windows was originally an interface that made the DOS user interface better, not a standalone product; that only truly changed with Windows 95 and the NT line). A further proof that Microsoft has reverted to the old days of the ’80s is their logo:

With this, I finish this informative rant about how Microsoft has stepped into a blunder yet again. This is not surprising at all as Windows tends to flourish every second edition (95 was not very good, 98 was praised, ME had tons of bugs, XP was magnificent, Vista was hated, 7 was welcomed). Thank you for reading and have a good day.

Monday, August 13, 2012

“I can stand the torture if they can stand the screaming.”

“Darkness is Death’s ignorance, and the Devil’s time.” – Godfather of Soul

What is a soul? Is there such a thing? Or is it an old placeholder like God?

Since science is based on empirical observations, and the soul has been generally linked to and observed with neurological activity, one could easily define ‘soul’ as the relatively unique personality or personal mental traits belonging to an individual. But as cool as neurology is, this is dry and boring; let’s go deeper.

There is one extremely important bit in the previous point – a soul is for an individual. It defines who we are by our personalities and values we hold dear. SG-1 even went the extra mile by saying that a soul can exist indefinitely as pure energy, therefore not needing a rotting body. In SG-1, a soul could actually recreate its body. In common fiction, a soul tends to retain memories as well as personality, emotions, etc. This is particularly curious as a soul appears to be immaterial, thus it has no obvious way of storing all that information.

A solution to the data storage problem is to limit the soul to a particular body. This would not explain all the out-of-body experiences we have throughout our lives (the moments, lasting mere instants, when we see ourselves from a spot about 5–20 metres away doing something or going somewhere), but it would solve the data problems – everything can be accessed from our memory. This comes dangerously close to the neurological explanation, as limiting the soul to a body, and therefore a brain, brings up the problem of magnetic coercion and the temporary changes in personality and judgement caused by aimed magnetic pulses hitting certain parts of the brain. A simple enough problem that leads to a dead end on our quest for the soul.

Therefore we should assume that a soul is not limited to a brain, and thus not merely a neurological effect of a specific brain’s structure. In this case, out-of-body experiences, reincarnations and Saving Hope are quite sensible options – in the first case the soul is no longer contained in the mind, gets out, wants back in in order not to perish, and does. A mind is necessary for a soul, but a soul can inhabit more than one mind during its existence. In addition, apparently new souls manifest and many old souls cease to exist. Reincarnation would be caused by an out-of-body experience where returning to the mind is impossible, which would mean one of two things happens. The first is that the soul perishes due to a lack of a support structure (a mind); the second is that the soul finds a fresh mind, the mind of a newborn. Whether that mind is already inhabited by a soul or not is up for theological debate (as is the question of the possibility of a mind accommodating more than one soul, i.e. a split personality complex).

In this case, we can exclude the option of memories being transferred along with the soul (or, in the best-case scenario, only extremely few of them are). But a personality can be transferred via a soul – it takes very little space and its very existence appears to constitute a soul.

This leads to an opportunity for you – what do you want to believe?  Do you wish to believe that the soul is merely a result of evolution, a side-effect of complexity… or is it something more, something that can move in time and space but requires a relatively stable environment for continued existence?

So, what do you believe?

Thursday, August 9, 2012

“All men are stupid, OK? Men stupid! If you want them to know something you have to tell them!”

For the past few days, during my late night outings, I have laid my eyes upon a sign that is shown a few times every evening:

“Piracy is theft”

Really? Call me crazy, but as long as I’ve been around, theft has meant removing an object (or data, as the case may be) from someone’s possession and claiming it as your own. Piracy (in the digital sense) is copying an object in someone else’s possession without consent and then sharing further copies of the same object.

To clarify – theft of a sheep would mean taking someone’s sheep and running off with it. Piracy of a sheep would mean cloning that person’s sheep, running off with the clone, cloning the clone and giving the secondary clones away for free.

You can see the distinct difference between the two – in one case there is only one sheep that the owner loses, in the other there are many sheep and the owner gets to keep his sheep.
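For the programmatically inclined, the same distinction in a dozen lines of Python (the Sheep class is, of course, invented for the example):

```python
import copy

class Sheep:
    def __init__(self, name: str):
        self.name = name

flock = [Sheep("Dolly")]          # the owner's flock: exactly one sheep

# Theft: the object leaves the owner's possession entirely.
stolen = flock.pop()              # flock is now empty

flock = [Sheep("Dolly")]          # reset for the second scenario

# Piracy: the original stays put while clones multiply elsewhere.
pirated = copy.deepcopy(flock[0])
freebies = [copy.deepcopy(pirated) for _ in range(3)]

print(len(flock))                 # 1 -- the owner still has his sheep
```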

Hence, the claim “Piracy is theft” continues to baffle me. Sure, there are certain similarities, but that is like saying all dogs are cats because they have four legs and are often kept as pets.

 

In the case that you, for some reason, have a wish to exclaim “But piracy does not require sharing in the common, everyday meaning of the word in a digital context!”, I beg you not to. It will merely make you look like a fool. Misusing words is not a tolerable action. Legally speaking, piracy requires sharing. Often enough, that’s what actually makes it illegal – in many countries (for example, this one) it is legal to make copies of digital material (except for programs and databases) for personal use. Sharing them is illegal, and even making the first copy without permission is a stretch already. In other words, offering a file without permission is evil and will land you in a special kind of hell; taking the offer is nothing to complain about, and you have a decent chance to avoid hell. Unless you have secured a spot there by your usual activities anyway.

And even if you mistreat words, there is no denying that in the case of theft, the owner loses his possession completely. In the case of piracy, the owner keeps the possession; its value is likely to plummet, but at the very least the owner keeps the sheep.

 

Sure, nitpicking is possible – often enough in the case of piracy the first copy is legal and made from the person’s own possessions (a bought CD), and it is afterwards shared with other people. Making copies of CDs or DVDs for personal use is not uncommon; since they are generally fragile, it is good to have a backup. Not just that, having a digital copy means simplified travelling – you don’t need to take the disc with you. It may take up some extra space on your hard drive or memory stick (who buys SSDs?), but those generally don’t run out of space very quickly. Unless the person in charge of the data space likes running awful stuff like iTunes or prefers to preserve every single file they’ve ever had. Often enough a simple run of SequoiaView will show which no-longer-necessary stuff takes up the most space.
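If SequoiaView is not at hand, a rough stand-in is easy to sketch in Python – a minimal directory walk (no treemap, just numbers) that reports the largest immediate subdirectories of a given path:

```python
import os

def dir_size(path: str) -> int:
    """Total size in bytes of all regular files under path."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            full = os.path.join(root, name)
            try:
                if not os.path.islink(full):
                    total += os.path.getsize(full)
            except OSError:
                pass  # unreadable file; skip it
    return total

def biggest_subdirs(path: str, top: int = 10) -> None:
    """Print the largest immediate subdirectories of path, biggest first."""
    subdirs = [os.path.join(path, d) for d in os.listdir(path)
               if os.path.isdir(os.path.join(path, d))]
    for size, d in sorted(((dir_size(d), d) for d in subdirs), reverse=True)[:top]:
        print("%10.1f MiB  %s" % (size / 2**20, d))

biggest_subdirs(".")  # point it at any suspect folder
```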

 

In conclusion, “Piracy is theft” is a curious yet false statement, due to the significant differences in meaning, no matter which sense of ‘piracy’ you mean (excluding the conquering of ships). On another note, making sure you know the local law on copyright infringement can earn you valuable knowledge about what you can do (copy) and what you can’t do (share). And regularly cleaning up your drives is also important if you want to be sure that your computer runs smoothly without any hiccups and that you have enough storage space (for German music, if need be).

Monday, August 6, 2012

“Standing on the defensive indicates insufficient strength; attacking, a superabundance of strength.”

“I believe in Spinoza's God who reveals himself in the orderly harmony of what exists, not in a God who concerns himself with fates and actions of human beings.” – Einstein

God is a byproduct of our search for truth; it explains what we cannot. It is a way of saying ‘it is so because it simply is’. It is no more than a lorem ipsum. Alas, more often than not people have failed to understand the purpose of a lorem ipsum, which is to await redaction. People don’t like to redact God.

What is most perplexing is the huge following a mere placeholder has; it is spread through many different cultures that, interestingly enough, conflict with each other. Confrontations arise on many fields, from politics and laws to everyday life. All in the name of a mere placeholder. An important and necessary placeholder, but a placeholder nonetheless.

This is the main problem – much too often God stands in the way of true innovation and our search for truth. It blocks the lamp and covers the wall.

Sunday, August 5, 2012

Dear people always leave.

I’ve written about perception before, and I will again, for errors of perception manifest not only in our minds, but in our instruments as well. There is a distinct difference between what we see and what we think we see – one is related to the eye, the other to the brain. We see a myriad of colours and lines; our brains try to make sense of the giant jigsaw puzzle by comparing known objects and shapes, or at the very least their appearance, to the patterns they find in the signals from the receptors in our eyes. What we perceive is the interpretation of these patterns. But neither our mind nor our eyes are perfect, and this is painfully obvious.

“Daniel saw in a way he’d never seen anything before: his mind was a homunculus squatting in the middle of his skull, peering out through good but imperfect telescopes and listening-horns, gathering observations that had been distorted along the way, as a lens put chromatic aberrations into all the light that passed through it. A man who peered out at the world through a telescope would assume that the aberration was real, that the stars actually looked like that—what false assumptions, then, had natural philosophers been making about the evidence of their senses, until last night? Sitting in the gaudy radiance of those windows hearing the organ play and the choir sing, his mind pleasantly intoxicated from exhaustion, Daniel experienced a faint echo of what it must be like, all the time, to be Isaac Newton: a permanent ongoing epiphany, an endless immersion in lurid radiance, a drowning in light, a ringing of cosmic harmonies in the ears.”

- “Quicksilver” by Neal Stephenson

Saturday, August 4, 2012

You could call it a review.

I thought I was a decent enough writer, enjoyable to read. Then I started reading “Quicksilver” by Neal Stephenson and realized that I was but a novice. Even Suzanne Collins shrinks to an amateur when compared to Stephenson’s work.

One of the more amazing feats of Stephenson, aside from the surprisingly detailed descriptions of events, places and people through the centuries, is the fictional detail. He has created a detailed lineage that we first meet in the 1700s (“Quicksilver”) and last see post-WWII (“Cryptonomicon”). And by ‘detailed’, I mean summarized life stories for almost every Waterhouse at the very least; more detailed stories are told about the protagonists, often enough Waterhouses themselves. As side characters we find the fictional Comstock family, as well as a few other made-up characters. But what makes the stories so special is the authenticity – he takes the saying ‘the best lie has a part of truth’ to a whole new level. Historical events (the foundation of MIT and Harvard, specific battles during the German Blitz), places and people (Alan Turing and Isaac Newton are merely two of the large cast of natural philosophers and great thinkers) are woven into the fiction. To Stephenson’s praise, when these facts about people or events are checked, they apparently check out. This means that extensive research and an amazing imagination have achieved perfect fusion to create impressive literature.

The creations of Stephenson are remarkable not just because of the fuzzy line between truth and fiction or the attention to detail; there is a good philosophical side to them – from general topics like the debate between free will and predestination and the existence of a soul, to more specific arguments about modern digital data ownership. All in all, a pretty well-rounded style of storytelling that deserves a read.

PS. The books also contain tons of interesting facts and instructions that make them entertaining and very educative. No surprise he would be the one spearheading CLANG. If you want to sample his creations, I’d recommend starting with “Quicksilver” (the first part of the Baroque Cycle) or “Cryptonomicon”. Don’t be daunted by the titles – they are more about the story than about complex algorithms, though occasionally a few of those sneak in.