Technology And Coding
- Author Johnson Philip
- Published November 15, 2009
- Word count 1,205
When a given object represents something else, it is either a symbol or a code: a symbol if it represents only one object, and (often) a code if it has the power to represent more than one object.
The number system was the earliest code developed by man. Once it had gone through about five stages of development, it became powerful enough to represent real-life phenomena in terms of mathematics. Such a representation can be called a code because the representation is not direct, as with a symbol, and understanding it requires considerable decoding.
Technology soon helped coding to be applied to higher levels of thought. The earliest use of technology was the abacus. Several millennia later came the slide rule, which took mathematical coding through a very large leap. Here numbers were converted to logarithms (which can be called a type of numerical code), and the logarithms in turn were converted to proportional lengths. Once numbers were coded into logs and logs into lengths, very large numbers could be represented as short lengths, and solving large computations became easy. Coding took advantage of technology to make age-old problems easy to handle.
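To make the slide rule's chain of coding concrete, here is a minimal Python sketch (not part of the original article): a number is coded as a logarithm, logarithms are added just as lengths are added on the sliding scales, and the sum is decoded back into a number, so multiplication reduces to addition.

```python
import math

def slide_rule_multiply(a: float, b: float) -> float:
    """Multiply the way a slide rule does: code each number as a
    logarithm (a length on the scale), add the two lengths, then
    decode the total length back into a number."""
    length_a = math.log10(a)             # first number coded as a length
    length_b = math.log10(b)             # second number coded as a length
    total_length = length_a + length_b   # sliding one scale along the other
    return 10 ** total_length            # decode: length back to a number

print(slide_rule_multiply(3.0, 7.0))  # ~21.0, within rounding error
```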
Coding took another leap when mechanical calculating machines based mostly on interlocking gear teeth were invented. Thanks to Charles Babbage, very advanced computing machines were conceived and some were built, based on systems of wheels and gears. This was another level at which technology helped numerical coding. The next stage was small relay-based computers that paved the way for binary-coded decimal computation. Meanwhile the arrival of analog computers did for computation what the slide rule had done for engineers. All of this, however, was only preparing the ground for ENIAC, the first machine that opened the way for modern computers.
While a computer looks like a simple computing machine, its actual working is heavily dependent upon coding. At a very early stage designers realized that computers cannot "represent" numbers the way symbols like 3, 5, or 7 do on paper. Thus binary numbers, which have only two digits, became the stuff of computers, and continue to be so. Since an on/off or positive-pulse/negative-pulse signal is the only kind whose state can be determined with mathematical certainty in spite of random corruption, the decimal system we are used to was coded into binary. Then the 0 and 1 (the only numerals in binary) were coded into the two allowed states of vacuum-tube valves or (eventually) transistors.
In effect, the two clear-cut states of a signal (say, on/off) were used to represent (code for) binary numbers, which in turn were used to represent decimal numbers. Thus, through several levels of coding, the processing of digital electronic signals came to represent decimal numbers. This was the ultimate in technological help in coding for numbers, and it gave rise to the next stage of coding, which has now brought computers into everyday life.
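As an illustration of these layers (a hedged sketch, not how any particular machine is wired), the following Python snippet codes a decimal number into binary digits and then codes each digit into one of two signal states:

```python
def decimal_to_binary(n: int) -> str:
    """Code a decimal number as a string of binary digits."""
    return bin(n)[2:]

def binary_to_signals(bits: str) -> list:
    """Code each binary digit as one of two electrical states.
    The labels '+pulse'/'-pulse' are illustrative stand-ins for
    the two allowed states of a valve or transistor."""
    return ["+pulse" if bit == "1" else "-pulse" for bit in bits]

n = 42
bits = decimal_to_binary(n)         # '101010'
signals = binary_to_signals(bits)   # ['+pulse', '-pulse', '+pulse', ...]
print(n, "->", bits, "->", signals)
```

Reading the chain in reverse, from signals to binary to decimal, is exactly the decoding a computer performs at every step.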
Computers, Everyday Life, Coding
The biggest dream of mankind, for at least the last millennium, has been to invent machines that can take over the routine activities of daily life. Soon this developed into a more specialized outlook: the dream was no longer restricted to machines that would take over routine activities, but was expanded to include machines that would take over dangerous activities and handle complex ones such as massive computation.
Soon the dream also included machines that can behave like humans: make decisions, spot crime, control traffic, mimic life, and even "think" like humans if possible. Obviously, no computer can do any of the above things, including mathematical calculations. All it can do is produce two kinds of distinct signals, positive and negative, or zero and non-zero. Combinations of these on-off states in batches of four, eight, or sixteen are then organized into codes that represent binary numbers.
The originators of binary computing developed an elaborate set of rules by which binary can stand as a code for decimal numbers. The turning point came when, using decimal numbers, they began representing all kinds of social phenomena, such as the text of this essay. One such coding is called ASCII. In this system, various decimal integers (not decimal fractions) represent numerals, small and capital letters, and selected symbols. This code is manipulated by word processors to produce the neatly laid-out text that has become an everyday sight for computer users. Once a database is available, code manipulation can also be used to create systems for railway reservations, college grade charts, dictionaries, and thesauri.
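A small Python example makes the ASCII idea visible; the built-in functions `ord` and `chr` expose the decimal numbers that stand behind each character:

```python
text = "Code"
numbers = [ord(ch) for ch in text]          # characters coded as decimal numbers
print(numbers)                              # [67, 111, 100, 101]

decoded = "".join(chr(n) for n in numbers)  # decode the numbers back to text
print(decoded)                              # 'Code'
```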
Statisticians have developed methods to represent real-life phenomena in terms of mathematics. Similar efforts are afoot in economics, psychology, meteorology, and scientific modeling. Fuzzy logic is a good example. No computer can work in a fuzzy manner: a transistor is either on or off, and a signal is either positive or negative. It cannot be part positive and part negative; it cannot be fuzzy. But by associating probability values with various phenomena, fuzzy logic comes into play when assessing those phenomena.
Thus when one drops a bunch of clothes into a modern washing machine, some machines use light reflected from the clothing to assess whether the clothes are clean, dirty, or very dirty. That is a preliminary application of fuzzy logic: instead of saying that a cloth has 75 units of dirt, the machine says that it is "very dirty". Multiple sets of elaborate code have to be employed before such a choice is made by the machine.
There is the flip-flopping of transistors at the first level, followed by layers of binary and then decimal numbers. Meanwhile the intensity of the reflected light is converted into a corresponding number, which now stands as a decimal "code" for how dirty the cloth is. This number is then "interpreted" at the decimal level to choose a category such as "clean" or "very dirty".
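A hedged Python sketch of that last decoding step might look like the following; the sensor reading and the thresholds are invented for illustration, not taken from any real washing machine:

```python
def classify_dirt(reflectance: float) -> str:
    """Map a light-sensor reading (0.0 = little light returned,
    1.0 = fully reflected) onto everyday categories.
    The thresholds here are purely illustrative."""
    if reflectance > 0.8:
        return "clean"
    elif reflectance > 0.4:
        return "dirty"
    else:
        return "very dirty"

for reading in (0.92, 0.55, 0.25):
    print(reading, "->", classify_dirt(reading))
```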
The description above is very simple compared to the actual complexity of coding in everyday life, but it does catch the essence of the multi-layer coding in computers that makes life easy today. Computers neither think nor deduce. Rather, physical phenomena are used as codes that represent information. Elaborate arithmetical and logical rules have been formulated to use these physical phenomena to represent mathematical operations, which in turn have been used to represent non-mathematical activities such as airline ticketing.
Advances in technology make this multi-layer coding faster, more reliable, and able to handle ever larger numbers. This in turn helps to represent non-mathematical phenomena in computer code more easily, because non-mathematical phenomena demand vast amounts of memory and computational power to be represented with reasonable efficiency.
The race is on to make machines that can deliver more of this computation, faster than is possible at present, without compromising on reliability. In other words, the race is on to use multiple levels of coding to do ever-increasing volumes of computation in real time, so as to attain the goal of computers that can eventually do everything that is still the stuff of science fiction. Perhaps that day is not too far off.
Dr. Johnson C. Philip, the President of TGSAT, is an internationally known physicist, theologian, and communicator.
Article source: https://articlebiz.com