Sunday, October 28, 2012

I sure hope long quotes do not become a habit.

“Recently I watched a debate concerning the abolition of the ERASMUS exchange program. It is uploaded as a Vbate in two parts (the 3rd and 4th speeches are in another debate) for all those who want to see it.

The proposing side (who wanted to get rid of ERASMUS) brought up the current economic crisis and the fact that a student foreign exchange program consumes a lot of valuable money while bringing no immediate benefits, which are in great demand during a crisis. They proposed cancelling the program temporarily, "for a short time". While this would bring a short-term boost in liquid resources, cancelling such a large-scale program for a short duration just to reboot it in a year would cause far more hassle and waste than it is worth. Rebuilding such a program is no easy task: the current staff would have to move on to other, more stable pastures, and getting them to return would mean even more problems. So the program would most likely lose a lot of valuable resources and effectiveness, not to mention trustworthiness. Cancelling the program temporarily therefore causes far too many problems, and by merely proposing such an idea, the proposition accepted that ERASMUS has to function, that it is important.

Another very odd problem they mentioned was 'distractions'. By this, they meant that the independent life of an exchange student brings new responsibilities, such as cooking or financing oneself, that hinder one's studies at a foreign university. As if every student who has not left their home country lives with their parents, goes to a local university, and so on. In that case, why do we even need dormitories? In any case, a rather poorly thought-out argument for exterminating ERASMUS.

A third point was its impracticality - the universities are not equal and, as such, one cannot simply hop between universities without sacrificing pieces of one's education in the process. But if the universities were equal, why hop between them at all? What would be the point of going abroad if the experience were the same as at home? Sure, certain similarities exist (which is why George Formby could say 'it's no different anywhere'), but that is exactly why it is possible to temporarily attend another university.

The benefits of temporarily going abroad to another university were already explained in the previous post - it is all about differing somewhat from everyone else with the same education. No two lecturers are exactly the same, and no two lecturers teach exactly the same things or use the same methods to teach them. Not to mention meeting new people, potential coworkers, employers, underlings, partners, and advisors, and experiencing a new culture, which teaches a person how that culture functions and makes the person more tolerant of different cultures and nations. On the whole, a very positive experience.

So this is why the proposition's arguments were shallow and weak. Next time let's see if the opposition fared any better.”

 

Again, the quotation marks are here for a reason. Origin: the same page as before.

Saturday, October 27, 2012

Some quotations are short, this one is long.

“The aim of the Bologna process is to make European universities comparable. Sure, accredited curricula are valid in every country in the European Union, which would indicate that the essential things studied are the same for each subject all over the Union. In most majors, it is possible to go abroad for some time, and the courses taken at another university are considered comparable with or equal to the same courses offered at the home university, all thanks to ECTS. This way studying abroad is not simply a method of experiencing a different culture or making friends or acquaintances with potential future colleagues, employees, or superiors; it is an opportunity to receive an education that is somewhat different from what everybody else in the home country receives. No two universities are exactly the same, as every lecturer teaches in a different way. Some know more tricks for solving difficult problems, some teach a certain methodical approach that others don't. This gives the student an edge in the job market. The final diploma the student receives is equal to other diplomas of the same kind because the courses taken are equal. There are some differences in the attractiveness of the diploma to potential employers - an Oxford graduate is more likely to get a job than someone who studied in Riga. But both can apply for the same positions, and sometimes the less likely candidate is chosen because of qualities unrelated to the alma mater, such as the personality, motivation, or first impression of the candidates. This is, as I already mentioned, the case in most majors.

In certain fields, such as law and medicine, a year abroad is a somewhat more complicated undertaking than in, for example, biochemistry. While the final result of medical school is the same, a chance to start residency, the way the curricula are built up differs quite a bit. Some universities, such as LMU München, have officially divided the medicine major into two stages of three years each. In a way, it is like finishing a Bachelor's and beginning a Master's, but the first three years do not actually confer a level of qualification. Most European universities use the 3+3 concept, but the curricula are not officially separated. The first three years are the so-called pre-clinical years, when the students must learn the theory of medicine, from the Latin names of all the tiny protuberances on every single bone of the body (sulcus tendinis musculi flexoris hallucis longi, to all those anatomy geeks) to the different treatments for complicated illnesses caused by various pathogens. The second three years are for practice, rotations, for the student to get personally acquainted with the actual everyday life of a working hospital, to learn how each part of a large hospital operates and how to become a good cog in the well-oiled machinery. The difference between universities is mostly in the way things are taught, but somewhat also in when something is taught.

In 'normal' majors, the student has a number of obligatory courses he/she has to pass and a large number of voluntary courses. The student can usually decide when to take the obligatory courses, sometimes making them ready for a Bachelor's diploma a full year before the nominal study time is over. In medicine, however, the obligatory subjects are set by year. If you don't pass them in the year you are supposed to, you either get thrown out or you take a year off and try again after a year has passed, but you do not get to advance to the next year's subjects. And this causes the problem with studying abroad. There are surprisingly few cases in which studying abroad works as it would if the major were something simpler, like computer science. Generally, the people who study medicine and do go abroad do it for a very short amount of time. This way they can still complete all the courses of the year at the home university, but gain valuable experience elsewhere as well. The other option would be to go for a whole year and repeat a year at the home university.

But this raises a rather important question: if all accredited medicine curricula are considered equal or comparable in their results, why aren't their components considered equal or comparable as well? It would stand to reason that if two wholes are equal, then the pieces of each whole ought to be equal as well.

It gets even more complicated with residency. While officially a resident is no longer a student but an employee of a hospital, the concept of residency is learning through intense practice. The duration of residency varies between countries. In Estonia, for example, residency lasts for 3 to 5 years, depending on the specific field of study. Most surgical fields require 5 years. Lately, however, there have been talks between politicians and the medical student union about adding an extra year to the beginning of each residency that would not be field-specific (extending residency to 4-6 years). Some countries, such as Germany, still employ an extra year called Internatur for this purpose, but this concept was abolished in Estonia about 15 years ago. Yet the result of residency is still the same: one becomes a fully fledged doctor of medicine. So if the purpose is the same and the result is the same, the methods employed should be the same (otherwise the result would not be the same) - why are there still differences between the systems in use in the European Union? It would appear that soon an Estonian medical student should do his or her best to start residency in another country, such as Germany or Finland, as opposed to going there after residency, as appears to be the case right now.”

 

The quotation marks are for a reason. In fact, I strongly recommend you check out this place, as I did.

Wednesday, October 24, 2012

Per aspera ad astra.

Education.

One of the worst things a school or university can be is too easy. In that case, those who don’t give a donkey’s backside about studying and those extremely fascinated by the subjects taught end up equal on paper. The results no longer signify the abilities, skills or knowledge of the people they are about. The people with great memory (or persistence) look exactly like the people with great processors (who improvise amazingly well using logic and derivation). Even the people who are great all-rounders look precisely the same when one looks at the results. With such an education, a person’s efforts, no matter how great, are equal to no effort. So why even try?

Unfortunately, this is the case in many places.

It is a similar problem to the one stated in the previous post (the video) – there are very few choices that matter in the field of education. Especially now that education has fallen far behind innovation:

Sure, this is something vaguely similar to scientology, but the point remains. The speed at which new discoveries are made is staggering. It is exacerbated by an odd publication bias. Papers get published about research done without anywhere near the professionalism that would be expected from established scientists (sometimes the tests are carried out on too few test animals, sometimes a control group is missing, and often the conclusions seem pulled out of thin air), while many papers on failed experiments never get published[1].
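To make the sample-size and file-drawer complaints concrete, here is a purely hypothetical sketch of my own (not something from the sources above): it simulates thousands of tiny experiments in which there is no real effect at all, yet a "publish only the significant results" filter still produces a steady stream of positive-looking papers.

```python
# Hypothetical toy model of underpowered studies plus publication bias.
# There is NO true effect, but tiny groups and a "publish only if p < 0.05"
# filter still fill the literature with spurious findings, while the
# failed experiments stay unpublished.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
n_experiments = 10_000
n_animals = 5        # the "insufficient numbers of test animals"
alpha = 0.05

published = 0
for _ in range(n_experiments):
    treated = rng.normal(0.0, 1.0, n_animals)   # no real treatment effect
    control = rng.normal(0.0, 1.0, n_animals)
    _, p_value = ttest_ind(treated, control)
    if p_value < alpha:                         # only "positive" results get written up
        published += 1

print(f"Published {published} of {n_experiments} null experiments "
      f"({published / n_experiments:.1%}); the rest end up in the drawer.")
```

Roughly one in twenty of these no-effect experiments clears the significance bar by chance alone, and if only those get written up, the published record looks far rosier than the underlying work.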

With all this malarkey going on, it is difficult to give modern education a positive assessment. Except that it is not too easy, as long as a person picks a challenge. And that is to be cherished.

Sunday, October 21, 2012

The not-so-secret life of us

In the modern world, large chunks of personal data, especially our life stories, can be found online. That is to say, if you happen to meet somebody new, try binging (or, for that matter, googling) their name and see what you find. You’ll probably find out quite a few of their fields of interest, even the type of crowd they hang out with – perhaps they use Google+, perhaps they use Facebook, perhaps they are on LinkedIn. By gathering some small details or, in the case of very expressive souls, many small and large details, one can learn a great deal about a new acquaintance. By learning about the events of a person’s past, you can deduce some facets of their current personality.

In addition to the event-based information that lies on the intertubes, quite a few people blog. Some do it daily, some weekly, some even more rarely, but there is a relatively continuous stream of information about the person’s thoughts, the events that matter to them, the news that interests them, and so on. It is there, and it gives a lot away about the person: the way one matures over time, the way one thinks about life, the reactions to events. In a way, it reflects the author’s soul.

When you meet someone interesting, try binging or googling them. It makes for an interesting read and gives you an overview of the person in question. Naturally, you can’t learn everything online, but it is a good start.

So let us finish with a good old video that stars a person who really reminds me of a bright bloke at Harvard:

via

Saturday, October 20, 2012

Life is like a fountain

A drop of water within it, actually. The drop, like a life, can go in a large number of different ways, each trajectory varying in distance, height, and duration. As does each life.

On the way from being pushed out from the pipe to the moment of stillness upon landing, a life meets many others, changing with every influence, sometimes losing a small piece of itself or gaining a bit of others. Sometimes a drop grows, but it always loses energy.

The fall of a drop can create waves, marking the impact of its existence.

The course of a life is not preset from its inception; only one thing is certain. At some point it will lie with the lives that have passed, for an eternity of tranquillity.

Wednesday, October 17, 2012

No man or woman is perfect, no brain infallible.

It would appear that scientists have conjured up a new method to test whether we are living in a computer simulation or in the real world: build a simulation (one that would be able to simulate our known Universe), test different scenarios and events, and see which fail. Those that fail, yet work in real life, must be the fault of the simulation. If everything works as in real life, then real life must itself be a simulation, because currently nobody can build a perfect simulation.

Now, one might think that, since there is no perfect simulation, any mistakes caused by the imperfections of our synthetic simulation could simply be fixed. Knowing modern coders, it will take only a few patches to start messing up normal stuff. Each patch eliminates a small bug, but creates a larger ripple that will have to be patched. Sure, it could be possible to create a simulation that is easy to manage, with small code churn per update. Unfortunately, this is more likely to happen if the simulation were designed by specialists in physics, yet the coding has to be the responsibility of computer scientists. Thus there is an issue with people: it is quite impossible to convey, in a reasonable amount of time, a large amount of detailed information that not only has to be memorized but also understood. The limits of human communication and imagination restrict this proposed simulation to the extent that it no longer indicates our status. The fact that something doesn’t work as it should could simply mean that the simulation is not good enough. After all, we cannot assume we are as smart as or smarter than whoever created the simulation we are currently in – we have yet to create a simulation so complex, yet perfectly functional.
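As a rough illustration of that patch-begets-bugs worry, here is a purely hypothetical toy model of my own (not anything the scientists proposed): each release fixes one known bug, but with some probability it also slips in new ones. Below a certain slip rate the backlog shrinks; above it, the bug count only grows.

```python
# Hypothetical toy model of "each patch eliminates a small bug, but creates
# a larger ripple": every release fixes one open bug, yet with probability
# p_new_bug it also introduces a couple of fresh ones.
import random

random.seed(42)

def simulate(releases: int, p_new_bug: float, bugs_per_slip: int = 2) -> int:
    open_bugs = 10                            # initial defects in the simulation code
    for _ in range(releases):
        if open_bugs > 0:
            open_bugs -= 1                    # the patch fixes one known bug...
        if random.random() < p_new_bug:
            open_bugs += bugs_per_slip        # ...but sometimes breaks something else
    return open_bugs

for p in (0.3, 0.5, 0.7):
    print(f"slip probability {p}: {simulate(1000, p)} bugs still open after 1000 patches")
```

With a low slip rate the backlog stays near zero, but once every patch is more likely than not to break something, a thousand releases leave the simulated Universe with hundreds of open bugs – which is exactly the sort of drift that would give the game away.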

Saturday, October 13, 2012

“We must dissent.”

It is the general consensus that most people are not pleased with the government that governs them. As such, it should also come as no surprise that the government, a body of people generally ungoverned, consists of a huge number of people, few of whom are actually qualified for the job. This has brought to mind quite a few thoughts, small quotations of sorts, that aptly describe the situation.

“To summarize: it is a well-known fact that those people who most want to rule people are, ipso facto, those least suited to do it. To summarize the summary: anyone who is capable of getting themselves made President should on no account be allowed to do the job. To summarize the summary of the summary: people are a problem.” (D. Adams)

The main problem is actually the people who want to govern. Generally these are not the typical educated specialists in a given field, but rather people who are popular with crowds. Sure, they may have academic qualifications, but they rarely get to use that education in the specific field they were educated in. They get elected, they govern, efficiency is low. Those who would fit better are generally simply not interested in the bureaucracy and power plays and mind games that go on. Power only corrupts because it attracts the corruptible. Straight shooters get ignored (as was the case with Ron Paul), as they are not willing to make a splash using lies or deception. This is why we can’t have nice things.

Tuesday, October 9, 2012

“Man has killed man from the beginning of time, and each new frontier has brought new ways and new places to die. Why should the future be different?”

So when is the right moment to use nuclear weapons? Sure, I’ve explained the best way to use them as an offensive weapon, but the whole concept of a nuclear deterrent is that it can be used defensively. But, as Mr Hacker so aptly put it, how can one defend oneself by committing suicide?

Using a nuclear weapon has quite a few problems, for instance the minimum range. Before using a nuke, one has to be certain the fallout won’t negatively affect that person, country, institution, or whatever that ‘one’ is. For example, in theory, Latvia would never be able to nuke Lithuania, as the bang would be too big – it would be like nuking oneself, hardly a bright idea. In modern times this minimum range requirement causes a huge problem: the nuke has to get far enough from the position it was fired from before there is any chance of interception. There is hardly any point in Norway nuking Moscow if the nuclear missile is intercepted above the Baltic Sea or a few miles from Riga or Tallinn. It’s even worse if Finland were to try to nuke Ukraine and the nuke barely reached the Gulf of Finland before someone shot it down or caused it to detonate. It’s like playing tennis with an extremely high net – sure, there is a tiny chance you’ll get the ball over, but more likely it’s going to drop right back down at you. Even if you get it far enough from you, it still has only a small chance of hitting the designated target.

The aforementioned examples are naturally never going to have even the measliest probability of happening, but the concept of nuke-blocking does work. And in a defensive position it is a very real problem, as the aggressor is most probably prepared for any possible nuclear launches and is ready to shoot down any nukes before they become a threat. In the worst-case scenario, the aggressor has to make a temporary tactical retreat due to scorched (or rather irradiated) earth. For those who do not know what the scorched earth tactic is, it is the simple concept of destroying all infrastructure (buildings, roads, pipelines, power lines) while falling back. This makes it more difficult for the other party (the hostile army) to pursue or resettle the land.

In the end, it is obviously wiser not to use nuclear weapons, unless one no longer cares about one’s own future – anyone who actually used weapons of mass destruction would either get a taste of their own medicine or get to taste the medicine in different flavours (other models of the same weapon, retaliation attacks). This is partly with the exception of the United States – a country as powerful and large as that can probably nuke a smaller country without causing too severe consequences for itself. Unless it strikes at an ally or another powerful state.

But if it is wiser not to use them, why have them? Well, to put it simply, it is to force salami tactics. Nobody actually wants to confront an enemy with nukes, not in a large-scale war at the very least. So the only option is to go slice by slice, giving the defender a chance to recover from the initial attack and call upon its allies to kick the aggressor’s backside all the way back to wherever it used to be before the military conflict. Without the threat of a nuclear response, it could be easy to take over a whole country by storm (or Blitzkrieg), but with it, such aggression would only invite nuclear retaliation, where retaliation is possible. The difference lies in the amount of pressure put on the weakening country and in its diplomatic and military options – the more options, the less likely it is that a nuclear weapon will be used. Basically, nuclear weapons are necessary for causing longer wars.

Saturday, October 6, 2012

“Between love and madness lies obsession.”

What is love? Surely it must be something more than what Haddaway sang about.

Well, the feeling of love is caused by the release of certain hormones, one of the most famous being oxytocin, the poetically named ‘love hormone’. The release of these hormones is also connected with simple crushes, short-lived attractions, which can cause confusion between love and lust. That in itself would not be so bad, if we could be sure that whatever we are feeling was definitely more than a dream.

Neurologically speaking, the signals going through one’s brain when one is awake are indistinguishable from those fired when one is simply dreaming. Therefore love itself could be no more than something we’ve dreamt up, as dreaming while awake is not uncommon. Just think of how the first stages of love are generally described: the feeling of light-headedness, the need for personal proximity, the inexplicable trust, the quickly drawn conclusions about the likeness of the other to oneself… sounds pretty dreamy to me.

One thing we can rule out is that love is a morbus, a disease. It is a (relatively) simple biochemical process that happens to almost every person at some point or another, quite possibly on multiple occasions. ‘Curing’ it would mean disrupting the natural processes that go on in one’s body. Oddly enough, even the trigger of the emotion is pretty much impossible to ‘cure’. There have been various trials with different hormones, even artificially creating a jump in oxytocin levels when one is exposed to a certain object (such as a photo) or person, thus training the person to associate that object or person with the feeling of love. These experiments have failed.

So love is a function of our bodies, aimed at finding better mates. This is slightly off-target when contemplating homosexuality (and the many different philias such as zoophilia or necrophilia), but these are practically unchangeable psychological conditions[1]. It is triggered by spotting certain details, the conditions of which are probably unique to the person doing the spotting. It can drive us nuts, completely destroying our ability to think clearly or about anything other than the trigger. But in other cases, it can cause bursts of creativity, the ‘muse’ effect, and it can create apparent self-confidence, often only present when the trigger is not around[2]. It has side effects that vary wall-to-wall, from drowsiness to hyperactivity, from extreme shyness to active social behaviour, from a mental drainpipe to a dream factory. But to know what it is, I guess one has to experience it first-hand.