Monday, January 9, 2012

What is NGN

A Next-Generation Network (NGN) is a packet-based telecommunications network that carries multiple types of traffic, such as voice, data, and multimedia. It represents the convergence of service provider networks, including the public switched telephone network (PSTN), the data network (the Internet), and, in some instances, the wireless network as well.
The NGN system offers key convergent multimedia services using a shared network characterized by several essential elements:
  • A unique and shared core network for all types of access and services.
  • A core network architecture divided into three layers: Transport, Control and Services.
  • Development of packet-mode transport (IP flows carried natively over IP or, in the short term, over ATM, with progressive convergence to IP).
  • Open and standardized interfaces between each layer, in particular for the Control and Services layers, in order to allow third parties to develop and create services independently of the network (a minimal sketch of this layering follows the list).
  • Support for multiple applications (multimedia, real-time, transactional, total mobility), adaptable to the user and to the growing and varied capabilities of access networks and terminals. [Adapted from Moving towards the Next Generation Networks (NGN)]
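To make the layering concrete, here is a minimal, illustrative Python sketch of the Transport/Control/Services split described above. It is not a real NGN stack, and every class and method name is invented for the example; the point is simply that services talk only to an open Control interface, so a third party can add a new service without touching the shared core:

    class TransportLayer:
        """Moves packets; knows nothing about sessions or services."""
        def send(self, dst, payload):
            print(f"[transport] {len(payload)} bytes -> {dst}")

    class ControlLayer:
        """Manages sessions and signalling on top of the transport layer."""
        def __init__(self, transport):
            self.transport = transport
        def open_session(self, dst):
            print(f"[control] session opened to {dst}")
            return dst
        def deliver(self, session, payload):
            self.transport.send(session, payload)

    class VoiceService:
        """A service written only against the open Control interface."""
        def __init__(self, control):
            self.control = control
        def call(self, dst):
            session = self.control.open_session(dst)
            self.control.deliver(session, b"\x00" * 160)   # one toy voice frame

    class VideoService:
        """A third-party service reusing the same shared core unchanged."""
        def __init__(self, control):
            self.control = control
        def stream(self, dst):
            session = self.control.open_session(dst)
            self.control.deliver(session, b"\x00" * 1400)  # one toy video packet

    core = ControlLayer(TransportLayer())  # one unique, shared core network
    VoiceService(core).call("alice")       # voice and video are just services
    VideoService(core).stream("bob")       # riding the same two lower layers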

Thursday, January 5, 2012

What is a Dual Core Processor

A dual core processor is a CPU with two separate cores on the same die, each with its own cache. It's the equivalent of getting two microprocessors in one.
In a single-core or traditional processor the CPU is fed strings of instructions it must order, execute, then selectively store in its cache for quick retrieval. When data outside the cache is required, it is retrieved through the system bus from random access memory (RAM) or from storage devices. Accessing these slows down performance to the maximum speed the bus, RAM or storage device will allow, which is far slower than the speed of the CPU. The situation is compounded when multi-tasking. In this case the processor must switch back and forth between two or more sets of data streams and programs. CPU resources are depleted and performance suffers.
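A rough back-of-the-envelope calculation shows how sharply a falling cache hit rate (for example, from multi-tasking evicting cached data) drags down average access time. The latency figures below are assumed orders of magnitude chosen for illustration, not measurements of any particular system:

    # Average memory access time = hit_rate * cache_latency
    #                            + (1 - hit_rate) * memory_latency
    cache_ns = 1.0   # assumed on-die cache latency
    ram_ns = 100.0   # assumed main-memory latency over the system bus

    for hit_rate in (0.99, 0.90, 0.50):
        amat = hit_rate * cache_ns + (1 - hit_rate) * ram_ns
        print(f"cache hit rate {hit_rate:.0%}: ~{amat:.1f} ns per access")

    # 99% -> ~2 ns, 90% -> ~11 ns, 50% -> ~50 ns: even a modest drop in hit
    # rate makes the average access many times slower than the CPU core.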
In a dual core processor each core handles incoming data strings simultaneously to improve efficiency. Just as two heads are better than one, so are two hands. Now when one is executing the other can be accessing the system bus or executing its own code. Adding to this favorable scenario, both AMD and Intel's dual-core flagships are 64-bit.
To utilize a dual core processor, the operating system must be able to schedule work across both cores, and the software must be written to run multiple threads so that the cores can be served instructions in parallel. (Strictly speaking, simultaneous multi-threading, or SMT, is a related hardware technique in which a single core runs several hardware threads at once, as in Intel's Hyper-Threading; what dual core systems need is simply multi-threaded software.) Without multi-threaded code, the software will effectively exercise only one core. Adobe® Photoshop® is an example of multi-threaded software. Multi-threading is also what lets software exploit the multi-processor systems common to servers.
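As a small sketch of what "multi-threaded software" means in practice, the Python example below splits a CPU-bound job across worker processes so that a dual core machine can genuinely run two chunks at once. The prime-counting workload and chunk sizes are invented for the illustration:

    from concurrent.futures import ProcessPoolExecutor
    import os

    def count_primes(limit):
        """CPU-bound toy workload: count primes below `limit` by trial division."""
        count = 0
        for n in range(2, limit):
            if all(n % d for d in range(2, int(n ** 0.5) + 1)):
                count += 1
        return count

    if __name__ == "__main__":
        chunks = [30_000, 30_000, 30_000, 30_000]
        # One worker per core: on a dual core CPU, two chunks run in parallel.
        with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
            results = list(pool.map(count_primes, chunks))
        print(sum(results), "primes found across", len(chunks), "chunks")

The same code still runs on a single-core machine; the chunks simply execute one after another instead of side by side.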
A dual core processor is different from a multi-processor system. In the latter there are two separate CPUs with their own resources. In the former, resources are shared and the cores reside on the same chip. A multi-processor system is faster than a system with a dual core processor, while a dual core system is faster than a single-core system, all else being equal.
An attractive feature of dual core processors is that they do not require a new motherboard, but can be used in existing boards that feature the correct socket. For the average user, the difference in performance will be most noticeable in multi-tasking until more software is multi-threaded. Servers running multiple dual core processors will see an appreciable increase in performance.
Multi-core processors are the long-term goal, and as process technology shrinks, more "real estate" becomes available on the die. In the fall of 2004, Bill Siu of Intel predicted that existing motherboards would remain adequate until 4-core CPUs eventually forced a changeover to a new memory controller, which would be required to handle four or more cores.

What is a Computer Virus

A Computer Virus is a relatively small software program that attaches itself to another, larger program in order to gain access to information or to corrupt information within a computer system. Some computer viruses are relatively harmless; for example, some just cause a certain message to pop up on the user's screen. Others can be deadly to the computers they infect, erasing information from hard drives, stealing data, and slowing down the entire system. Like any other software, a computer virus must be created and written by someone; once created, viruses can multiply rapidly and spread from computer to computer.

Computer Virus Definition & Characteristics:

A Computer Virus is a program that can copy itself and infect a computer without the permission or knowledge of the user. A computer virus has two major characteristics: the ability to replicate itself, and the ability to attach itself to another computer file. Every file or program that becomes infected can in turn act as a virus itself, allowing the infection to spread to other files and computers. The term "computer virus" is often used incorrectly as a catch-all phrase for all types of malware, such as computer worms, Trojan horses, spyware, adware, and rootkits - all of which are slightly different from computer viruses.

A computer virus needs another program in order to be activated and to infect other files on a computer. Essentially, a computer virus rides piggyback on another file into your computer; once that file is executed, the virus will replicate, attach itself to other program files, and continue to spread.

A Brief History of Computer Viruses:

Although computer scientists had been aware of the theoretical possibility of computer viruses for decades, it was not until the 1980s that viruses began to gain a foothold and multiply in large numbers. With the advent of the personal computer, floppy disk drives, and other portable storage devices, it became easier to program viruses and transfer them from one machine to another. Fred Cohen is often cited as the first person to use the term "computer virus" in an academic paper, in 1984, although some suggest that he may have learned it from his mentor, Leonard Adleman.

Does My Computer Have a Virus?

Of course, if you ask anyone "What is a computer virus?", the reply is likely to be a bit jumbled. Computer viruses come in many different forms, although many share the same goals: to slow the infected computer system, to steal and/or copy information from it, and to attach themselves to outgoing files and attachments in order to spread to other computer systems. The following are a few telltale signs that your computer might have a virus:
• Slow response and slow program execution
• Random hard drive crashes and restarts
• Distorted graphics and text
• Files that have mysteriously vanished
• Extensive pop-up ads
• Inability to open password-protected files with passwords that previously worked

How Can I Protect My Computer from a Virus?

It is far better to avoid computer viruses and take proactive measures to protect your computer system than it is to clean up after them once they have gotten into your computer and caused damage. The following tips will help increase your odds of avoiding infection from a computer virus:
• Download programs only from trusted, reputable websites
• Install a quality Internet firewall
• Do not open suspicious emails or email attachments
• MOST IMPORTANTLY — Make sure you have a trusted anti-virus program installed on your computer - such as Norton Antivirus, NOD32 Antivirus, or Kaspersky Antivirus.

Monday, January 2, 2012

What is Linux

A lot of the advantages of Linux are a consequence of Linux's origins, deeply rooted in UNIX, except for the first advantage, of course:
  • Linux is free:
    As in free beer, they say. If you want to spend absolutely nothing, you don't even have to pay the price of a CD. Linux can be downloaded in its entirety from the Internet completely for free. No registration fees, no costs per user, free updates, and freely available source code in case you want to change the behavior of your system.
    Most of all, Linux is free as in free speech:
    The license commonly used is the GNU General Public License (GPL). The license says that anybody who wants to do so has the right to change Linux and to redistribute a changed version, on the one condition that the code is still available after redistribution. In practice, you are free to grab a kernel image, for instance to add support for teletransportation machines or time travel, and sell your new code, as long as your customers can still have a copy of that code.
  • Linux is portable to any hardware platform:
    A vendor who wants to sell a new type of computer and doesn't know what kind of OS the new machine will run (say, the CPU in your car or washing machine) can take a Linux kernel and make it work on the hardware, because documentation related to this activity is freely available.
  • Linux was made to keep on running:
    As with UNIX, a Linux system is expected to run without rebooting all the time. That is why many tasks are executed at night or scheduled automatically for other calm moments, resulting in higher availability during busier periods and a more balanced use of the hardware. This property also makes Linux suitable for environments where people don't have the time or the ability to control their systems night and day.
  • Linux is secure and versatile:
    The security model used in Linux is based on the UNIX idea of security, which is known to be robust and of proven quality. But Linux is not only fit for use as a fort against enemy attacks from the Internet: it will adapt equally well to other situations, applying the same high standards of security. Your development machine or control station will be as secure as your firewall (a small taste of the underlying permission model is sketched after this list).
  • Linux is scalable:
    From a Palmtop with 2 MB of memory to a petabyte storage cluster with hundreds of nodes: add or remove the appropriate packages and Linux fits all. You don't need a supercomputer anymore, because you can use Linux to do big things using the building blocks provided with the system. If you want to do little things, such as making an operating system for an embedded processor or just recycling your old 486, Linux will do that as well.
  • The Linux OS and most Linux applications have very short debug-times:
    Because Linux has been developed and tested by thousands of people, both errors and people to fix them are usually found rather quickly. It sometimes happens that there are only a couple of hours between discovery and fixing of a bug.
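As that small taste of the UNIX permission model Linux inherits, the Python sketch below, run on a Linux system, creates a file and reads back the classic read/write/execute bits that the kernel keeps for the file's owner, its group, and everyone else. The file name is just an example:

    import os
    import stat

    path = "example.txt"            # hypothetical file for the demo
    open(path, "w").close()         # create an empty file to inspect
    os.chmod(path, 0o640)           # owner: rw-, group: r--, others: ---

    mode = os.stat(path).st_mode
    print(stat.filemode(mode))      # prints "-rw-r-----"
    print("owner may write:", bool(mode & stat.S_IWUSR))   # True
    print("others may read:", bool(mode & stat.S_IROTH))   # False
    os.remove(path)                 # clean up the demo file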