Science is gay – we all know it, but only a few acknowledge it. The world is gay, so science, the study of the world, must be too.
One of the fundamental flaws of tertiary education in Ireland is the complete absence of philosophy in STEM. I don’t know a single STEM student – and, having completed a STEM degree, I know a few – who has taken a single philosophy module, not even one on formal logic.
Materialism and atheism are thus unsurprisingly widespread in an academia that mirrors the general beliefs of society. And, just like society’s, their understanding of the world is based on pathetic comic books and Marvel movies.
With not a grain of sound introspection, our academic elites do not realise that half of cosmology, with its multiverse theories, is just a bad attempt at metaphysics; firmly believe that multi-dimensional, untestable string theory deserves the label of “fundamental” or any attention at all; and commit world-scale data-analysis fraud with their so-called images of black holes.
If we look at the history of science from the time of the ancient Greeks, there is a clear trend of science becoming more and more abstract – and, since the late 19th century, more and more abstruse too.
For most of modern history, science was under the heavy influence of Christian idealism, but with the decay of our Christendom the Faustian spirit took control of its trajectory. It used to be that one desired Truth for the sake of Truth alone – looking up to the Heavens for guidance on the path towards Truth, men yearned for the perfection embodied in Christ or, in the case of the ancient Greeks, well… nothing. This is perfectly encapsulated in the following story:
According to Stobaeus, “someone who had begun to read geometry with Euclid, when he had learnt the first theorem, asked Euclid, ‘But what shall I get by learning these things?’ Euclid called his slave and said ‘Give him threepence, since he must make gain out of what he learns.’”
People with limited cognitive abilities often believe that the Church as an institution is dramatically opposed to all kinds of scientific endeavour and that faith is the antithesis of science-based knowledge.
To give a couple of examples to the contrary: Gregor Mendel, an Augustinian friar, is the founder of modern genetics; Georges Lemaître, a Catholic priest, was the first to propose the “Big Bang theory” and contributed significantly to its growth and popularity. Two examples, both ironic, because Mendelian genetics and the Big Bang theory are the bane of modern biology and cosmology, respectively. Nevertheless, the Catholic faith is the source of man’s desire to pursue scientific ideals. As the venerable Fulton J. Sheen put it:
“How could science be an enemy of religion when God commanded man to be a scientist the day He told him to rule the earth and subject it?”
One could thus say that the profession of a scientist is deeply sacred and ought to be performed only by the most trusted and most holy.
With many scientists of the top echelon moving to the USA during the first half of the 20th century, the European ideal of Truth, which had already been sabotaged by the logical positivists, disappeared completely from the minds of subsequent generations.
Both World Wars and the heavy influence of the capitalistic United States changed the state of academia for a long time to come. The hardships of war prioritised pragmatism above all else, so the sincere pursuit of Truth was derailed and eventually halted altogether.
Since then, nothing of value has been done. We have watched Western universities turn into multi-billion institutions, yet our fundamental understanding of the universe is essentially unchanged – only newer and newer models built on all kinds of approximations, with a worryingly increasing number of parameters added without justification save for maintaining a façade of veracity.
The best illustration of the above is an anecdote from Werner Heisenberg’s autobiography: in 1929 he talked with an American scientist and mentioned what we now call wave-particle duality. “We are desperate,” he said. “There is one experiment in which an electron behaves like a particle and another in which it behaves like a wave, and we cannot understand it.” The American’s response: “What’s the problem?”
The duality is not a problem in the eyes of corporate Yankees. Why bother asking “why” when it’s possible to step over the issue entirely (call it a “fundamental property of nature”) and move on?
The behaviour of the electron is known anyway, so it’s enough to profit from it. Theoretical pondering on the fundamental nature of the universe plays no role in American philosophy, because it can’t make your wallet grow. We can see this clearly now, when no one has time to contemplate any matter deeply – not even the theoreticians – given the tempo the capitalistic system forces us all to live at.
Not long before the outbreak of the First World War, Einstein’s relativity theories made waves in the world of physics. Just like the war, they resulted in a series of aberrations and tragedies. They were a clear step away from intuitive science: overly abstract, filled with unnatural ideas of spacetime and curved space, as if we lived in four (or more) spatial dimensions. To quote the mad genius Nikola Tesla:
“Einstein’s relativity work is a magnificent mathematical garb which fascinates, dazzles and makes people blind to the underlying errors. The theory is like a beggar clothed in purple whom ignorant people take for a king… its exponents are brilliant men but they are metaphysicists rather than scientists.”
Though the theories were aberrant, they did not have to end tragically. There was a chance to reverse the derangement, but the war changed the power structure of the scientific community, putting on the pedestal those involved in governmental projects.
Making a nuclear bomb is not easy, but neither does it require a genius. And a genius was needed to fix physics. Instead, however, we got physics for mathematicians. As Richard Feynman pointed out in one of his lectures, physics cares about very specific scenarios, while mathematics pursues generalisations. Of what use is an n-dimensional generalisation if we live in only three dimensions? Regardless, n dimensions are what became important, and slowly the mathematicians-turned-physicists forgot that it is nature they are dealing with, not Platonic, mathematical abstractions. Without the constraints of nature, their imagination runs wild, causing astronomical damage to science.
“For nothing has done more injury to science than the play of imagination subject to no control, on the part of men who enjoy in the public press the rank of scientific authorities.” – George Ellery Hale
In the end, even Einstein lost the plot and said the following:
“Since the mathematicians have invaded the theory of relativity I do not understand it myself any more.”
At the same time, science became institutionalised. Universities received government funding and became the behemoths they are now, effectively creating an oligopoly on science. But with the money being fed to them came the need for researchers to justify why they should receive the grants and why they are better than everyone else.
To do that, they coined a new word: “pseudo-science”. The term is now used in the same manner as “conspiracy theory”, meant only to discredit “rogue” scientists fighting against dogmas. And even with modern philosophers of science speaking against this nonsense – Feyerabend’s Against Method, for example – it has stayed with us to this day. Not a big surprise: no one cares about philosophy.
Rarely does corruption involve handing someone envelopes of cash under a table; usually, it looks like the CIA scene in Wag the Dog. Everyone of average or above-average intelligence knows their role in society and, consciously or unconsciously, knows what needs to be done to prolong a lucrative career.
A man’s value is derived solely from his usefulness to the community, so he will never knowingly undermine his position and will swear on his life that restarting that particle collider is really, really important for science. In the age of liberalism, it doesn’t matter that he is sabotaging the progress of society and wasting billions of taxpayers’ money; what matters is the number of zeros in his personal bank account. Scientists are especially good at justifying their work – it’s literally 95% of their daily work.
Most research nowadays is done for the sake of profit and fame alone. Just as a philosopher who starts making money becomes a sophist, a scientist who starts making money becomes a marketing campaign manager, the only point of his existence being to gather as much funding for his group’s research as possible (e.g. “Two decades of Alzheimer’s research was based on deliberate fraud by 2 scientists that has cost billions of dollars and millions of lives”).
Particle physics has seen no genuine advancement for a long time. Many novel particles have been “discovered”, each with less evidence for its existence than Bigfoot, but at least it prolonged the perception of progress.
The Standard Model (the central theory of particle physics) is so “fundamental” that it has over 50 unexplained parameters. For comparison, cosmology has “only” 17. No wonder many particle physicists have been transitioning to other fields of physics – especially cosmology – escaping the consequences of their incompetence.
Unfortunately, just like immigrants coming to Europe, instead of fixing their broken behaviour they continue in their ignorance, leaving everything they touch corrupted.
The “discovery” of the Higgs boson is hailed as one of the greatest achievements of 21st-century science, and it allegedly explains the origin of mass. Except, as with the so-called images of black holes, the evidence is non-existent, and the “achievement” rests on massive amounts of data analysis built on strong presuppositions that require knowledge of the very phenomena it is trying to prove in the first place; see Unzicker’s The Higgs Fake.
Whether it’s particle physics or cosmology, it’s extremely easy to lie with the amount of data they gather, especially since they rarely release it to the public. Don’t hold your breath thinking that restarting the Large Hadron Collider (LHC) will bring some new, actual insights; it’s just a continuation of the scam.
I hope I don’t have to say this, but the peer-review process doesn’t change anything. Realistically, it’s another type of appeal to authority or, as Auron MacIntyre would certainly put it, an appeal to the managerial elite.
To say that the peer-review process is flawed is not even that controversial these days. Still, no one is willing to step away from it, perhaps because the people who would oversee the creation of a new system are the ones who benefit the most from its current state.
The process is inherently weighted against new breakthroughs and against ideas coming from “the outside”, whether from laymen or from professionals in other areas – another example of the implicit corruption present in the system.
Moreover, with the increasing popularity of science fiction due to our feminised Reddit culture, many losers do science to bring to life the silly, unrealistic ideas they read about in their favourite comic books. If we look back, even in the early 20th century most scientists were upper-class, respectable gentlemen; the profession of a scientist came with a huge amount of respect and responsibility.
Nowadays, it has deteriorated into cringe nerds who can’t even sit up straight.
Too many good men have walked away from science because of the culture surrounding it. Soon enough, the community will be wholly overrun by delusional soyboys and feminists producing nothing but pure poison to a healthy mind. Soon enough, nobody will care about science. And it’s all because of those shitty science-fiction and superhero comic books. Oswald Spengler talked about it:
“The tyranny of the Reason — of which we are not conscious, for we are ourselves its apex — is in every Culture an epoch between man and old man, and no more. Its most distinct expression is the cult of exact sciences, of dialectic, of demonstration, of causality. Of old the Ionic, and in our case the Baroque were its rising limb, and now the question is what form will the down curve assume?
“In this very century, I prophesy, the century of scientific critical Alexandrianism, of the great harvests, of the final formulations, a new element of inwardness will arise to overthrow the will-to-victory of science. Exact science must presently fall upon its own keen sword. First, in the 18th Century, its methods were tried out, then, in the 19th, its powers, and now its historical role is critically reviewed. But from Skepsis there is a path to ‘second religiousness’, which is the sequel and not the preface of the Culture. Men dispense with proof, desire only to believe and not to dissect.
“The individual renounces by laying aside books. The Culture renounces by ceasing to manifest itself in high scientific intellects. But science exists only in the living thought of great savant generations, and books are nothing if they are not living and effective in men worthy of them. Scientific results are merely items of an intellectual tradition. It constitutes the death of a science that no one any longer regards it as an event […].” – The Decline of the West, Vol. 1
Probably the most ubiquitous feature in works of science fiction is Artificial General Intelligence (AGI), since it’s a good basis for gloom-and-doom dystopia stories. The field of Artificial Intelligence (AI) has been enjoying renewed attention for a couple of years now, and the topic of AGI is quite popular too.
From Musk to Gates, there are more than enough voices predicting the coming destruction of mankind by machines. The problem with such statements is that it took millions of dollars and state-of-the-art computers to beat an Asian kid at Asian checkers (sometimes also called Go). It has only 2 rules. Yes, the greatest achievement of AI is beating a nerd at a game with literally 2 rules.
What a victory! To be fair, the board has 361 points, and the number of possible positions quickly gets out of hand. Nevertheless, the problem is relatively easy, having clear rules and a finite number of possibilities; it’s just a matter of optimisation and computing power – unlike self-driving cars.
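To put “gets out of hand” into numbers, here is a minimal back-of-the-envelope sketch (purely illustrative, not an exact legal-position count): each of the 361 points can be empty, black, or white, so a naive upper bound on board configurations is 3^361 – enormous, but still finite and well defined, which is exactly the kind of space raw computing power can chew through.

```python
# Back-of-the-envelope bound on Go board configurations: each of the
# 361 points (19 x 19) is empty, black, or white, so 3^361 is a crude
# upper bound (the number of *legal* positions is smaller, but still
# astronomically large).
from math import log10

points = 19 * 19            # 361 intersections on a standard board
upper_bound = 3 ** points   # naive configuration count

print(f"3^{points} has {int(log10(upper_bound)) + 1} digits")
# -> 173 digits, i.e. roughly 1.7e172 configurations
```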
Many years and many billions have been spent on the development of self-driving cars, and the best we can get is a Tesla that can drive straight if you fall asleep behind the wheel. It will still brake randomly or simply crash and kill you – unless it catches fire and kills you before it crashes, of course. Great achievement!
Frankly, fully self-driving cars won’t exist for many decades, assuming they are even possible, considering that driving a car is one of the most complex tasks humans perform regularly – a game with 2 rules and 361 points can’t compare.
Given that, AGI is not even worth talking about, but that won’t stop grifters from promising that in 5 years we will have this or in 20 years we will have that. Nobody remembers, though, that the same things were being said over half a century ago:
AI research began in the mid-1950s. The first generation of AI scientists were convinced that general intelligence was possible and that it would exist in just a few years. Pioneer Herbert A. Simon wrote in 1965: ‘machines will be capable, within twenty years, of doing any work a man can do.’
As if that weren’t bad enough, ironically, one of the most challenging problems in the field is producing a sound definition of what exactly AI (and intelligence or thinking in general) is. Every time a new “AI” is created, it is promptly downgraded by the community, which claims it’s not actually intelligence and hence doesn’t deserve to be called AI.
The phenomenon is called the “AI effect”, and as a result some cynically claim that AI can only ever be a solution to problems we have not yet solved (in fact, this has an official name, Larry Tesler’s Theorem: “AI is whatever hasn’t been done yet.”). Or, as the researcher Rodney Brooks puts it:
“Every time we figure out a piece of it, it stops being magical; we say, ‘Oh, that’s just a computation.’”
Oh, if only they knew. As Keith Woods pointed out in his excellent article about Google’s “sentient” AI, the only thing a computer can do is computation; consciousness is not computation, and there is an unbridgeable qualitative gap between humans and computers.
This should be obvious to any rational person, especially to those who are meant to form our intelligentsia. But neither overpaid software engineers nor underpaid STEM researchers understand these differences, and they keep propagating computationalism, functionalism, and other bogus ideas.
If normal academics are bad at philosophy, software engineers are off-the-scale bad. It’s one of the best-paying professions these days, one that is also rather easy to get into and, in many cases, requires no real effort or skill.
After a few years of earning high six figures for doing nothing, they often venture into philosophy and quickly get infected with socialism, transhumanism, and radical materialism. The delusion is so profound that some claim any feedback system, such as that of a thermostat, could be conscious.
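For the record, this is roughly all that a thermostat-style feedback system amounts to – a minimal sketch with hypothetical setpoint values, included only to make the comparison concrete:

```python
# A minimal sketch of the kind of feedback system being discussed: a
# bang-bang thermostat. The setpoint and hysteresis values are
# hypothetical; the entire "behaviour" is a pair of comparisons.
def thermostat_step(current_temp: float, heater_on: bool,
                    setpoint: float = 20.0, hysteresis: float = 0.5) -> bool:
    """One control step: return the new heater state (True = on)."""
    if current_temp < setpoint - hysteresis:
        return True          # too cold: switch the heater on
    if current_temp > setpoint + hysteresis:
        return False         # too warm: switch the heater off
    return heater_on         # inside the dead band: keep the current state

print(thermostat_step(18.0, heater_on=False))  # True
print(thermostat_step(22.0, heater_on=True))   # False
```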
For any Christians reading this, I hope the knee-jerk reaction is to laugh and rightfully ridicule such an absurd supposition. From the point of view of a normie atheist, though, it really isn’t that crazy to believe. I mean, if humans are just meat robots called animals that arose from non-living matter billions of years ago, then why can’t robots (or thermostats) be conscious too? This only reminds us, the dissidents, that true change requires the rejection of all the stipulations of modernity.
Coming back to AI: though the hype is severely overblown for the sake of gathering research funding, there are still reasons to worry. Not because the calculator might decide it’s best to destroy humanity after being given too much control over our lives, but because most AI research is directed toward spying and military technology – Big Brother is getting bigger. There is not much one can do to counteract this dystopian reality beyond some small, almost symbolic, changes.
The obvious start is to limit the amount of data we leave online and to choose free and open-source alternatives to proprietary software (for example, Linux, Signal, GrapheneOS, etc.). But again, those are just symbolic gestures; true change will only come when we re-establish the Inquisition.
In this not-so-short rant I have focused mostly on physics and mathematics, since they are the most rigorous fields, and when one speaks of the scientific method one usually thinks of a physical experiment. Fields like biology and medicine have well-developed, almost indestructible dogmas that are plainly false – too reductionistic – the result of multi-billion marketing campaigns funded by Big Pharma Inc. (SSRIs are a good example). The social sciences are haunted by the replication crisis and plagued by a lack of competence, especially in statistics.
Moreover, psychology was established by sexual predators of anti-Christian origin. If the state of physics and mathematics is pathetic, the rest of science is bound to be orders of magnitude worse.
In conclusion: science is gay because the world is gay, but it need not be so forever. Science in its pure form is central to a healthy society. Hence, I hope that more pure-hearted Catholic men will nurture their desire for the ideal of Truth by trying to advance our current state of knowledge, even as a hobby, and bring science back under the dominion of the sane.