Friday 18 August 2017

The Purity Principle, Randy Alcorn

“Confessing your sins is great, confessing your temptations is even greater” - Randy Alcorn

This article is a summary of the book The Purity Principle by Randy Alcorn. The book discusses sexual purity: why it is important to maintain it and how we can achieve it. Randy quotes Scripture as and when required to authenticate the message he wants to convey to the reader. Nobody is free from temptation; even pastors are prone to it. Throughout the book, Randy shares his own experiences, and the reader is thus able to connect with the author.

The book begins with a brief account of the lives of various people, such as Eric and Tiffany (fictitious names, probably), who pursued immoral, impure paths of life and ended up with disastrous results. Although God forgave their mistakes, they had reached a point of no return in their lives. This was because, as Randy asserts, punishment is built into the sins we commit. Randy views purity as wisdom and impurity as stupidity.

Sin can be of various types; however, sexual sin is different from other kinds of sin. Sodom and Gomorrah, the cities destroyed by God, had fallen deeply into sexual sin. Paul reiterates this in his letters to the Corinthians (1 Cor 6:18) and the Thessalonians (1 Thess 4:3-8). An interesting point to note is that sex is more about what you are than about what you do. Sex defines us. Thus it is important to purify ourselves by keeping away from sexual sin. Also, by accepting Jesus as the Christ, we were bought at a price, and hence our body belongs to God (1 Cor 6:19-20). As a result, we cannot do whatever we please with our body.

When it comes to temptations, Christians are the most vulnerable because of the constant threat from the devil. It is important to guard the mind in order to maintain purity. Sometimes it is better to avoid temptation than to resist it. Notice how avoiding differs from resisting: if we keep resisting, we are still subjecting ourselves to the temptation, whereas when we avoid temptations, we stay away from them altogether.

Randy lists various aspects of life where it is necessary to maintain a pure mind and body and thereby follow the purity principle. Despite the wickedness of larger sins such as infidelity, it is important to focus on the lesser ones that feed our minds daily in the form of media, peers, novels, movies and art. For example, if we regularly watch a TV programme that shows immoral content, we should stop watching it. We live in an era where media and technology (e.g. WhatsApp, Facebook, YouTube, Instagram) can control our lives immensely, so we have to be prudent in the manner in which we use these resources. Randy compares his advice with what Jesus commanded in Matthew 5:29 and claims that what he instructs is nothing radical compared to Jesus' command.

Randy makes it a point to handle the cases of singles and married couples differently. Single, unmarried men and women should prepare themselves for their marriage; Randy explores various issues related to dating, masturbation, and peer pressure. For married couples and parents, Randy observes that countless marriages have suffered due to dishonesty by one of the partners. Even after entering into marriage, we are not free from sexual temptations. It is necessary for couples to be open with each other about everything, especially sexual temptations, and to request prayers from one another.

The book concludes with a discussion on maintaining accountability with others. Randy recalls how on one occasion he underwent temptation and how the temptation vanished after a phone call with his friend. Thus, we are in a constant battle with the devil. However, this is a battle that we can win. It is indeed possible to avoid temptations and thereby maintain sexual purity and holiness.

And that's what The Purity Principle is all about!

Sunday 9 July 2017

Blogs/Websites that I follow

I share here some of the blogs that I follow. Most of the articles in these blogs are about research, education, PhD, and computer science.
I hope these blogs will be of help to you.

Sunday 12 February 2017

Maps

The map lists various areas of study that come under the aegis of mathematics. It is a Herculean task to enumerate all the disciplines in mathematics. Dominic has done an amazing job here.
  • Foundations - Fundamental Rules, Mathematical Logic, Set Theory, Category Theory, Theory of Computation, Complexity Theory 
  • Pure Mathematics
    • Number Systems - Natural Numbers, Integers, Rational Numbers, Real Numbers, Complex Numbers
    • Structures - Number Theory, Combinatorics, Algebra, Linear Algebra, Group Theory, Order Theory
    • Spaces - Geometry, Trigonometry, Fractal Geometry, Differential Geometry
    • Changes - Calculus, Vector Calculus, Chaos Theory, Dynamical Systems, Complex Analysis
  • Applied Mathematics - Numerical Analysis, Game Theory, Economics, Engineering, Computer Science, Machine Learning, Probability, Statistics, Cryptography, Optimization, Biomathematics, Mathematical Physics, Mathematical Chemistry

Tuesday 3 January 2017

Vijay Amritraj @ IITM

I had the opportunity to listen to Vijay Amritraj as part of Shaastra '17.

He is a great orator. He has served as a Messenger of Peace for the United Nations, reporting directly to the then UN Secretary-General, Kofi Annan. The Vijay Amritraj Foundation takes up various social causes.

He is confident of the improvements taking place in the sports scene of the country. When he was playing tennis in the 70s, cricket was the only sport in India. Now, people in India pursue and follow various other sports such as badminton, chess, shooting, kabaddi, and wrestling.

He commented that people in our country like to play everything safe; that is why even now not many people pursue sports as a career. Still, he is confident that there is talent in the country.

He remarked that technological advances, such as computerized rankings and Hawk-Eye predictions, have impacted tennis in many ways. The average height of players has increased, rackets have become stronger, surfaces have become slower, and balls have become heavier. The quality of the game has improved a lot.

Thursday 22 December 2016

To Kill A Mockingbird, Harper Lee

Awesome Read! 

In To Kill A Mockingbird, Lee discusses a lot of issues - gender disparity, class distinction, race division, capital punishment, rape, the Great Depression, and ethics and morality - as viewed from the perspective of the young girl, Scout. TKAM is one of the best fictional novels I have ever read.

To spice things up, I gave my own titles to each chapter in the book.

Part I
Chapter 1 - Radley Opening
Chapter 2 - Morning Sickness
Chapter 3 - Afternoon Show
Chapter 4 - Vacation Drama
Chapter 5 - Tweet Radley
Chapter 6 - Peep Talk
Chapter 7 - Thank Cement
Chapter 8 - Tundra Blaze
Chapter 9 - Landing Tussle
Chapter 10 - Dead Shot
Chapter 11 - Dubose Deadly

Part II
Chapter 12 - Cal Church
Chapter 13 - Finch Pride
Chapter 14 - Dill Flee
Chapter 15 - Cunningham Encounter
Chapter 16 - The Courthouse
Chapter 17 - Tate-Bob Witness
Chapter 18 - Mayella Witness
Chapter 19 - Tom Witness
Chapter 20 - Closing Remarks
Chapter 21 - The Verdict
Chapter 22 - Tears of Injustice
Chapter 23 - Gender-Class-Race Divide
Chapter 24 - Missionary Tea Hypocrisy
Chapter 25 - Maycomb Tribune
Chapter 26 - Grace Double Standards
Chapter 27 - Back to Normal
Chapter 28 - Bob Attack
Chapter 29 - Boo Save
Chapter 30 - Alternate Story
Chapter 31 - All is Well

Tuesday 6 December 2016

English Grammar Punctuation

Notes from Eats, Shoots & Leaves (Lynne Truss)
 
Traditionally, punctuation made it easier to read text aloud or signalled a pause; this was especially useful for actors on stage. In modern usage, punctuation serves additional functions, such as indicating emphasis, marking syntactic structure, and avoiding ambiguity.

Every publishing house follows a different style guide for punctuation. Additionally, British usage differs from American usage (e.g. the placement of punctuation within quotation marks).

Apostrophe: possessive marker (e.g. Jack's, boy's, boys'), to indicate omission (e.g. summer of '69), to indicate time or quantity (e.g. two months' notice), plurals of letters and words (e.g. f's, do's and don't's); no apostrophe is needed for plurals of abbreviations (e.g. MPs and MLAs) or dates (e.g. 1980s)

Comma: for lists (e.g. Tom, Dick and Harry), for joining complete sentences, bracketing commas (instead of em-dash or parenthesis);

Semicolon and Colon: to indicate pause and emphasis

Exclamation mark, italics, quotation marks (single and double), brackets (round, square, curly, angle)

How to choose between single and double quotation mark?
How to choose among round bracket, em-dash and comma?

Hyphen: to combine words (e.g. pre-train), when a noun phrase acts as an adjective (e.g. state-of-the-art model), to split unfinished words at the end of a line, to avoid ambiguity (e.g. re-formed vs. reformed)
 
Punctuation Marks
  • Full stop 
    Alice met Bob.
  • Comma
    Alice gave Bob a pen, paper, and a pencil. 
    Alice, a student, met Bob.
  • Semicolon
    Alice gave Bob a paper; Bob took it reluctantly.
  • Colon
    Alice gave Bob a few items: a pen, a paper, and a pencil.
  • Question mark
    Did Alice meet Bob?
  • Exclamation mark
    Hurray, we won! Yipee!
  • Quotes
    “Come,” Alice told Bob.
  • Apostrophe denotes contraction and possession.
    it's, Alice's, p's, 7's, 1990s, MPs
  • Hyphen
    Does your organization have a by-law?
  • Dash denotes comment
    Alice will not come - I hope so.
  • Parentheses denotes supplementary information.
    Alice (a student) met Bob.
 
Dash-like characters and their Unicode code points:
  • Hyphen (U+2010): to represent compound terms
  • Em dash (U+2014): in place of commas or parentheses (use the em dash sparingly and prefer the alternatives)
  • En dash (U+2013): to denote ranges
  • Minus (U+2212): to represent subtraction
  • Hyphen-minus (-, U+002D): the ASCII hyphen
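These code points can be verified with a short Python snippet (standard library only; unicodedata reports each character's official name):

```python
import unicodedata

# Print each dash-like character next to its code point and official name.
for cp in (0x2010, 0x2014, 0x2013, 0x2212, 0x002D):
    ch = chr(cp)
    print(f"U+{cp:04X}  {unicodedata.name(ch):<14}  {ch}")
```

Running this makes it easy to spot the visual differences between the hyphen, the two dashes, and the minus sign.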

Friday 16 September 2016

Information Theory, Khan Academy

Recently, I have been going through some interesting videos on information theory. I am posting the link here: Information Theory, Khan Academy, for the benefit of those who want to learn something new, fundamental and interesting. I will try to summarize what I understood from the lectures below.

1. Ancient Information Theory
It is interesting to note how humans started communicating with each other. Ancient humans used pictographs and ideograms engraved on rocks and in caves to share information. Later on, symbols and alphabets were devised for ease of communication.

With the passage of time, more advanced communication technologies were developed. For instance, the Greeks and Romans used torches, especially in battles, for quick long-distance communication. In the 17th century, shutter telegraphs were the norm; they could cover all the letters of the English alphabet, and with the help of a telescope it was possible to send information across great distances. Still, these techniques were not sufficient for effective communication due to their low expressive power and low speed.

With the discovery of electricity, the information age began. Visual telegraphs were soon replaced by electrostatic telegraphs, and it became possible to send large amounts of information over long distances in a short time.

2. Modern Information Theory
How fast can two parties communicate with each other? The limiting rate at which messages can be sent depends on the symbol rate (baud) and the number of differences. The symbol rate is the number of symbols that can be transferred per second. There is a fundamental limit on the spacing between two pulses: due to noise, a pulse may not be perfect, and if two pulses are very close to each other, there is a high chance of inter-pulse interference between them, which makes it difficult for the receiver to decode the signal. The number of differences is the number of distinguishable signaling events per symbol. The message space denotes the number of possible messages.

How is it possible to quantify information? A possible way to solve this problem is to consider a scenario where the receiver asks the sender a number of yes/no questions in order to receive the information. Based on the minimum number of questions required to receive the complete message, it is possible to quantify information. The unit is called a binary digit, or bit for short. Mathematically, this is simply the logarithm (to base 2) of the size of the message space. For example, to pass the name of a book of the Bible, 6-7 bits are required; to pass the names of 2 books of the Bible, 12-14 bits are required.
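Those numbers are easy to check in Python (assuming the usual count of 66 books in the Bible as the message space):

```python
import math

# Information needed to name one of N equally likely messages: log2(N) bits.
def bits_needed(message_space: int) -> float:
    return math.log2(message_space)

# 66 books in the Bible -> between 6 and 7 bits for one title.
print(bits_needed(66))       # ~6.04
# Two independent titles -> the message space multiplies, so the bits add.
print(bits_needed(66 ** 2))  # ~12.09, i.e. exactly twice as many
```

The second line also shows why information is additive for independent messages: multiplying message spaces turns into adding logarithms.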

Human communication is a mix of randomness and statistical dependencies. Claude Shannon uses a Markov model as the basis for thinking about communication. Given a message, a machine can be designed that generates similar-looking text. As we progress from the zeroth-order approximation to first-order and second-order approximations, the similarity of the generated text to the original message increases.
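The idea can be sketched with a toy first-order model over characters (a minimal illustration, not Shannon's machine): the next character is sampled from the distribution of characters observed to follow the current one in the training text.

```python
import random
from collections import defaultdict

# First-order (bigram) model: for each character, record its successors.
def train(text):
    follows = defaultdict(list)
    for a, b in zip(text, text[1:]):
        follows[a].append(b)
    return follows

# Generate text by repeatedly sampling a successor of the last character.
def generate(follows, start, length, seed=0):
    rng = random.Random(seed)
    out = [start]
    while len(out) < length and follows.get(out[-1]):
        out.append(rng.choice(follows[out[-1]]))
    return "".join(out)

model = train("the theory of communication and the theory of information")
print(generate(model, "t", 30))
```

A second-order model would condition on the last two characters instead of one, producing noticeably more English-like output.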

Entropy - Shannon gives a mathematical method to quantify information. He calls this quantity entropy: H = -Σ p log2(p), where p ranges over the probabilities of the outcomes. If all the outcomes are equally likely, entropy is maximal; if there is some predictability in the outcomes, entropy comes down. For example, a text with random words and letters has higher Shannon information than a "normal" text. This seems counter-intuitive at first, but information theory deals not with the semantic content of a message but with the number of symbols required to communicate it. The only way to regenerate the random text is to copy it as is, whereas the "normal" text can be compressed using rules, due to its predictable nature.
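The formula is easy to put into code. A minimal sketch in Python (the coin probabilities below are my own illustrative values, not from the lectures):

```python
import math

# Shannon entropy: H = -sum(p * log2(p)) over the outcome probabilities.
def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin (all outcomes equally likely) carries the maximum 1 bit per toss,
print(entropy([0.5, 0.5]))  # 1.0
# while a biased coin is partly predictable, so its entropy is lower.
print(entropy([0.9, 0.1]))  # ~0.469
```

The `p > 0` guard handles impossible outcomes, for which the term p·log2(p) is taken to be 0.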

Coding theory - If the entropy is not at its maximum, it is possible to compress a message. But in what ways can we compress a message? David Huffman came up with the Huffman coding strategy, which compresses a message by encoding symbols into bits with the help of a binary tree. However, the limit of (lossless) compression is the entropy of the message source: if a message is compressed beyond this limit, information is lost.
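Huffman's greedy idea, repeatedly merging the two least frequent subtrees and prefixing a 0 or 1 to each side's codes, can be sketched in a few lines (a minimal illustration, not the lecture's code):

```python
import heapq
from collections import Counter

# Build a Huffman code by repeatedly merging the two least frequent subtrees.
# Each heap entry is (frequency, tiebreaker, {symbol: code-so-far}).
def huffman_code(text):
    heap = [(freq, i, {sym: ""})
            for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    i = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (f1 + f2, i, merged))
        i += 1
    return heap[0][2]

code = huffman_code("aaaabbc")
encoded = "".join(code[ch] for ch in "aaaabbc")
print(code, len(encoded))  # frequent 'a' gets the shortest code
```

Frequent symbols end up near the root of the tree and receive short codes, which is exactly why the scheme compresses predictable text well.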

Error detection & correction - During communication, noise in the channel corrupts the message, making it difficult for the receiver to understand it. How is it possible to deal with noise? Richard Hamming came up with the idea of parity bits, building upon the concept of repetition. Error correction is thus achieved by using more symbols to encode the same message, which increases the size of the message.
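As an illustration of parity-based correction, here is a sketch of the classic Hamming(7,4) scheme (my example, not code from the lectures): 4 data bits are protected by 3 parity bits, and the set of failing parity checks spells out the position of a single flipped bit.

```python
# Hamming(7,4): positions 1..7 hold p1, p2, d1, p3, d2, d3, d4.
def encode(d):
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def correct(c):
    # Recompute each parity check; the failing checks, read as a binary
    # number, give the 1-based position of the flipped bit (0 = no error).
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3
    if pos:
        c[pos - 1] ^= 1  # flip the offending bit back
    return c

word = encode([1, 0, 1, 1])
word[4] ^= 1                                   # channel noise flips one bit
print(correct(word) == encode([1, 0, 1, 1]))   # True
```

Note the trade-off the post describes: 7 transmitted bits carry only 4 bits of data, so robustness is bought with a larger message.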

Reference
Shannon, Claude Elwood. "A mathematical theory of communication." ACM SIGMOBILE Mobile Computing and Communications Review 5.1 (2001): 3-55. 

Friday 15 July 2016

Animal Farm, A Fairy Story - George Orwell

Animal Farm is a nice political satire on the Soviet Union, written by George Orwell in 1943-44. The story portrays, in an intelligent manner, how the ideals of the revolution were corrupted.

An interesting feature of Animal Farm is that the book spans only about 90 pages, which helps it remain a favorite among busy working readers even today.

I gave my own title to each chapter in the book. Spoiler Alert!

Chapter I The Dream Proposition
Chapter II The Rebellion Theorem
Chapter III The Heydays Axiom
Chapter IV The Recapture Claim
Chapter V The Napoleon Prime
Chapter VI The Windmill Lemma
Chapter VII The Traitors Corollary
Chapter VIII The Frederick-Pilkington Conundrum
Chapter IX The Boxer-Glue Conjecture
Chapter X The Pig-Man Paradox

Tuesday 24 May 2016

A Brief History of Time, Stephen Hawking

The Uncertainty Principle
Laplace argued that the universe was deterministic, i.e. we can predict the future state of the universe provided we know its current state. However, Heisenberg's uncertainty principle showed that the more accurately we try to measure the position of an object, the less accurately we can know its momentum. As a result, it is difficult to measure the exact state of the universe at any given point in time.
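In standard notation (my addition, not spelled out in the book's text here), with Δx the uncertainty in position and Δp the uncertainty in momentum, the principle reads:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
```

where ħ is the reduced Planck constant; squeezing Δx down forces Δp to grow, and vice versa.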

Planck's quantum hypothesis and Heisenberg's uncertainty principle led to the theory of quantum mechanics, in which the position of an object is defined in terms of probabilities, i.e. an object will be at position A at time B with some probability C.

The dual nature of light is an implication of quantum mechanics. The quantum hypothesis said that light energy, which was thought to be composed of waves, is emitted in discrete packets of particles called quanta. The uncertainty principle said that particles may seem to occur at multiple positions depending on the measurement. Interference of waves as well as of particles (the double-slit experiment) has been observed.

The interference of particles helped physicists understand the nature of electron orbits in an atom. There are only a finite number of valid orbits in an atom because of the constructive interference of electron waves around the nucleus; destructive interference makes certain orbits around the nucleus unavailable.

Einstein's general theory of relativity (a classical theory) does not take quantum mechanics into consideration. It is necessary to combine the two theories in order to arrive at a general, unified, consistent theory.

Roger Schank Blog


In the latest article on his blog, Roger Schank argues that AI is far more than just keyword matching and that an AI program should be able to exchange thoughts, hypotheses and solutions with other programs and humans.
 
He was commenting on the current state of AI research in the context of the massive media attention given to the news of a Georgia Tech professor announcing, at the end of his course, that one of his TAs was actually an "AI". In fact, the "AI" he was referring to was nothing better than programs such as MARGIE, ELIZA and PARRY, which were written decades ago and performed simple keyword matching.

He predicts an impending AI winter 2.0 due to the skewed perception that the media, and in turn the public, have about the real potential of AI. He says there are important questions to consider in the field of AI, such as:
  • Can we build machines that think, wonder, remember, feel and understand?
  • Is it enough if we continue to build machines which do not perform any of the above actions but are still useful in one way or another?