IEEE Transactions – IEEE Transactions Projects on:
1. IEEE Transactions on Communication
2. IEEE Transactions on Computers
3. IEEE Transactions on Data Mining
4. IEEE Transactions on Dependable and Secure Computing
5. IEEE Transactions on Distributed Computing System
6. IEEE Transactions on Image Processing
7. IEEE Transactions on Information Forensics and Security
8. IEEE Transactions on Internet Computing
9. IEEE Transactions on Knowledge and Data Engineering
10. IEEE Transactions on Learning Technologies
11. IEEE Transactions on Mobile Computing
12. IEEE Transactions on Multimedia Computing
13. IEEE Transactions on Network Computing
14. IEEE Transactions on Network Security
15. IEEE Transactions on Parallel and Distributed Systems
16. IEEE Transactions on Pattern Analysis & Machine Intelligence
17. IEEE Transactions on Services Computing
18. IEEE Transactions on Signal Processing
19. IEEE Transactions on Software Engineering
20. IEEE Transactions on Vehicular Technology
21. IEEE Transactions on Visualization and Computer Graphics
22. IEEE Transactions on Web Application and Web Service
23. IEEE Transactions on Wireless Communication
24. IEEE Transactions on Wireless Sensor Networks
25. IEEE Transactions on Fuzzy systems
26. IEEE Transactions on Neural Networks
IEEE - Institute of Electrical and Electronics Engineers
A non-profit organization, IEEE is the world's leading professional association for the advancement of technology.
pronounced - Eye-triple-E
IEEE's Constitution defines the purposes of the organization as "scientific and educational, directed toward the advancement of the theory and practice of electrical, electronics, communications and computer engineering, as well as computer science, the allied branches of engineering and the related arts and sciences."
In implementing these goals, the IEEE serves as a major publisher of scientific journals and a conference organizer. It is also a leading innovator of industrial standards in a broad range of disciplines, including electric power and energy, biomedical technology and healthcare, information technology, information assurance, telecommunications, consumer electronics, transportation, aerospace, and nanotechnology.
IEEE develops and participates in educational activities such as accreditation of electrical engineering programs in institutes of higher learning. The IEEE also serves student members in colleges and universities around the world. Prospective members and organizations purchase IEEE products and participate in conferences or other IEEE programs.
SCOPE OF IEEE:
IEEE Computer Architecture Letters is a bi-annual forum for fast publication of new, high-quality ideas in the form of short, critically refereed, technical papers.
Submissions are accepted on a continuing basis, and accepted letters will be published immediately in the IEEE Digital Library and in the next available print issue. Members of the Technical Committee on Computer Architecture will receive the print issue as a benefit of being a member. Authors should submit their manuscript through Manuscript Central.
JAVA
Java is a programming language originally developed by James Gosling at Sun Microsystems (which is now a subsidiary of Oracle Corporation) and released in 1995 as a core component of Sun Microsystems' Java platform. The language derives much of its syntax from C and C++ but has a simpler object model and fewer low-level facilities. Java applications are typically compiled to bytecode (class file) that can run on any Java Virtual Machine (JVM) regardless of computer architecture. Java is a general-purpose, concurrent, class-based, object-oriented language that is specifically designed to have as few implementation dependencies as possible. It is intended to let application developers "write once, run anywhere." Java is currently one of the most popular programming languages in use, particularly for client-server web applications.
The original and reference implementation Java compilers, virtual machines, and class libraries were developed by Sun from 1995. As of May 2007, in compliance with the specifications of the Java Community Process, Sun relicensed most of its Java technologies under the GNU General Public License. Others have also developed alternative implementations of these Sun technologies, such as the GNU Compiler for Java, GNU Classpath, and Dalvik.
James Gosling, Patrick Naughton, Chris Warth, Ed Frank, and Mike Sheridan conceived Java at Sun Microsystems.
It is a platform-independent programming language whose features extend widely over the network. The Java 2 platform introduced a new component set called "Swing": a set of classes that provides more powerful and flexible GUI components than are possible with the AWT. Swing is a lightweight toolkit, as its components are not implemented by platform-specific code.
Related classes are contained in javax.swing and its subpackages, such as javax.swing.tree. Swing components have more capabilities than those of the AWT. The Java platform differs from most other platforms in that it is a software-only platform that runs on top of other, hardware-based platforms.
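A minimal sketch of the ideas above, assuming nothing beyond the standard javax.swing and java.awt packages (the class name SwingDemo is illustrative):

```java
import java.awt.BorderLayout;
import javax.swing.JButton;
import javax.swing.JFrame;
import javax.swing.JLabel;
import javax.swing.SwingUtilities;

public class SwingDemo {
    // Components such as JLabel are lightweight: they are drawn in Java,
    // not by platform-specific peer widgets as in the AWT.
    static JLabel buildLabel() {
        return new JLabel("Hello from Swing");
    }

    public static void main(String[] args) {
        // Swing UIs should be built on the event-dispatch thread.
        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("Swing demo");
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.add(buildLabel(), BorderLayout.CENTER);
            frame.add(new JButton("Click me"), BorderLayout.SOUTH);
            frame.pack();
            frame.setVisible(true);
        });
    }
}
```

The frame itself is a heavyweight top-level window, while the label and button inside it are lightweight Swing components.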
Principles of Java
There were five primary goals in the creation of the Java language:
1. It should be "simple, object oriented and familiar".
2. It should be "robust and secure".
3. It should be "architecture neutral and portable".
4. It should execute with "high performance".
5. It should be "interpreted, threaded, and dynamic".
DOTNET
.NET is Microsoft's development model in which software becomes platform- and device-independent and data becomes available over the Internet. The .NET Framework is the infrastructure of .NET. .NET is built from the ground up on an open architecture. .NET is a platform that can be used for building and running the next generation of Microsoft Windows and Web applications. The goal of the Microsoft .NET platform is to simplify Web development.
The .NET Framework provides the foundation upon which applications and XML Web services are built and executed. The unified nature of the .NET Framework means that all applications, whether they are Windows applications, Web applications, or XML Web services, are developed by using a common set of tools and code and are easily integrated with one another.
Benefits of using the .NET Framework:
The benefits of using the .NET Framework for developing applications include:
· Based on Web standards and practices: The .NET Framework fully supports existing Internet technologies, including HTML, HTTP, XML, SOAP, and other Web standards.
· Designed using unified application models: The functionality of a .NET class is available from any .NET-compatible language or programming model. Therefore, the same piece of code can be used by Windows applications, Web applications, and XML Web services.
· Easy for developers to use: The .NET Framework provides a unified type system, which can be used by any .NET-compatible language. In the unified type system, all language elements are objects, and these objects can be used by any .NET application written in any .NET-based language.
· Extensible classes: The hierarchy of the .NET Framework is not hidden from the developer. You can access and extend .NET classes through inheritance.
MOBILE COMPUTING
Mobile computing is a generic term describing one's ability to use technology while moving, as opposed to portable computers, which are only practical for use while deployed in a stationary configuration. Mobile internet access is generally slower than direct cable connections, using technologies such as GPRS and EDGE, and more recently 3G networks. These networks are usually available within range of commercial cell phone towers. Higher speed wireless LANs are inexpensive, but have very limited range.
Mobile Computing, IEEE Transactions on
The IEEE Transactions on Mobile Computing focuses on the key technical issues related to (a) architectures, (b) support services, (c) algorithm/protocol design and analysis, (d) mobile environments, (e) mobile communication systems, (f) applications, (g) components, including devices, hardware, and software, and (h) implementation issues, including interference, power, and software constraints of mobile devices.
NETWORKING
A network is a group of interconnected computers. Networks may be classified according to a wide variety of characteristics. This article provides a general overview of some types and categories and also presents the basic components of a network. A computer network allows computers to communicate with each other and to share resources and information.
This publication is devoted to the timely release of high-quality papers that advance the state of the art and practical applications of computer networks. The journal covers such topics as: network architecture and design, communication protocols, network software, network technologies, network services and applications, and network operations management.
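The resource sharing and communication described above can be sketched with the standard java.net sockets API; the EchoDemo class and its single-line echo protocol are illustrative, not from any particular project:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

public class EchoDemo {
    // Sends one line to a tiny in-process echo server and returns the reply.
    static String echoOnce(String message) throws Exception {
        try (ServerSocket server = new ServerSocket(0)) {   // port 0 = any free port
            Thread serverThread = new Thread(() -> {
                try (Socket client = server.accept();
                     BufferedReader in = new BufferedReader(
                             new InputStreamReader(client.getInputStream()));
                     PrintWriter out = new PrintWriter(client.getOutputStream(), true)) {
                    out.println(in.readLine());             // echo the line back
                } catch (Exception e) {
                    throw new RuntimeException(e);
                }
            });
            serverThread.start();
            // The client side: connect, send one line, read the echo.
            try (Socket socket = new Socket("localhost", server.getLocalPort());
                 PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(socket.getInputStream()))) {
                out.println(message);
                return in.readLine();
            }
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(echoOnce("hello, network"));
    }
}
```

The same client code would work against a server on another machine; only the host name and port change.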
DATA SECURITY
Data security is the means of ensuring that data is kept safe from corruption and that access to it is suitably controlled. Thus data security helps to ensure privacy. It also helps in protecting personal data. Software-based security solutions encrypt data to prevent it from being stolen. However, a malicious program or a hacker may corrupt the data in order to make it unrecoverable or unusable. Similarly, encrypted operating systems can be corrupted by a malicious program or a hacker, making the system unusable.
Hardware-based security solutions can prevent read and write access to data and hence offer very strong protection against tampering and unauthorized access. Hardware-based or hardware-assisted computer security offers an alternative to software-only computer security. Security tokens such as those using PKCS#11 may be more secure due to the physical access required in order to be compromised. Access is enabled only when the token is connected and the correct PIN is entered.
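A software-based encryption round trip can be sketched with the standard javax.crypto API using AES in GCM mode; the EncryptDemo class and its helper names are illustrative:

```java
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;

public class EncryptDemo {
    static final int IV_BYTES = 12;   // recommended nonce size for GCM
    static final int TAG_BITS = 128;  // authentication tag length

    static byte[] encrypt(SecretKey key, byte[] iv, byte[] plaintext) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(TAG_BITS, iv));
        return cipher.doFinal(plaintext);
    }

    static byte[] decrypt(SecretKey key, byte[] iv, byte[] ciphertext) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.DECRYPT_MODE, key, new GCMParameterSpec(TAG_BITS, iv));
        return cipher.doFinal(ciphertext);
    }

    // Encrypts and then decrypts a string, returning the recovered plaintext.
    static String roundTrip(String secret) throws Exception {
        SecretKey key = KeyGenerator.getInstance("AES").generateKey();
        byte[] iv = new byte[IV_BYTES];
        new SecureRandom().nextBytes(iv);  // a fresh IV for each message
        byte[] ciphertext = encrypt(key, iv, secret.getBytes(StandardCharsets.UTF_8));
        return new String(decrypt(key, iv, ciphertext), StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundTrip("personal data"));
    }
}
```

GCM also authenticates the ciphertext, so the corruption scenario above would be detected: a tampered ciphertext makes doFinal throw rather than silently return garbage.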
CLOUD COMPUTING
Cloud computing is a style of computing in which dynamically scalable and often virtualized resources are provided as a service over the Internet. Users need not have knowledge of, expertise in, or control over the technology infrastructure in the "cloud" that supports them.
The concept generally incorporates combinations of the following:
• infrastructure as a service (IaaS)
• platform as a service (PaaS)
• software as a service (SaaS)
Cloud computing customers do not generally own the physical infrastructure serving as host to the software platform in question.
Instead, they avoid capital expenditure by renting usage from a third-party provider. They consume resources as a service and pay only for resources that they use. Many cloud-computing offerings employ the utility computing model, which is analogous to how traditional utility services (such as electricity) are consumed.
GRID COMPUTING
Grid computing (or the use of computational grids) is the combination of computer resources from multiple administrative domains applied to a common task, usually a scientific, technical, or business problem that requires a great number of computer processing cycles or involves processing large amounts of data. One of the main strategies of grid computing is using software to divide and apportion pieces of a program among several computers, sometimes up to many thousands.
Grid computing is distributed, large-scale cluster computing, as well as a form of network-distributed parallel processing. A grid may vary in size from small (confined to a network of computer workstations within a corporation, for example) to large (a public collaboration across many companies and networks).
It is a form of distributed computing whereby a “super and virtual computer” is composed of a cluster of networked loosely coupled computers acting in concert to perform very large tasks. This technology has been applied to computationally intensive scientific, mathematical, and academic problems through volunteer computing, and it is used in commercial enterprises for such diverse applications as drug discovery, economic forecasting, seismic analysis, and back-office data processing in support of e-commerce and Web services.
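The divide-and-apportion strategy can be sketched on a single machine with java.util.concurrent, splitting one task among worker threads the way a grid scheduler splits it among computers; the SplitWorkDemo class is an illustrative stand-in, not a real grid framework:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class SplitWorkDemo {
    // Splits the sum 1..n into `chunks` independent tasks, computes the
    // partial sums concurrently, and gathers the results.
    static long parallelSum(long n, int chunks) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(chunks);
        try {
            List<Future<Long>> futures = new ArrayList<>();
            long step = n / chunks;
            for (int i = 0; i < chunks; i++) {
                long lo = i * step + 1;
                long hi = (i == chunks - 1) ? n : (i + 1) * step;
                // Each task is independent, so it could as well run on
                // another machine in a grid.
                futures.add(pool.submit(() -> {
                    long s = 0;
                    for (long v = lo; v <= hi; v++) s += v;
                    return s;
                }));
            }
            long total = 0;
            for (Future<Long> f : futures) total += f.get();  // gather partial results
            return total;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(parallelSum(1000, 4));  // 500500
    }
}
```

In a real grid the `submit` step would ship each task to a remote node, but the decomposition and gathering logic is the same.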
DATA MINING
Data mining is the process of extracting patterns from data. As more data are gathered, with the amount of data doubling every three years, data mining is becoming an increasingly important tool for transforming these data into information. It is commonly used in a wide range of profiling practices, such as marketing, surveillance, fraud detection, and scientific discovery. Data mining commonly involves four classes of tasks:
• Classification - Arranges the data into predefined groups.
For example, an email program might attempt to classify an email as legitimate or spam. Common algorithms include nearest neighbor, the naive Bayes classifier, and neural networks.
• Clustering - Like classification, but the groups are not predefined, so the algorithm tries to group similar items together.
• Regression - Attempts to find a function that models the data with the least error. A common method is genetic programming.
• Association rule learning - Searches for relationships between variables. For example, a supermarket might gather data on what each customer buys.
Using association rule learning, the supermarket can work out what products are frequently bought together, which is useful for marketing purposes. This is sometimes referred to as "market basket analysis".
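As a hedged illustration of the classification task above, a tiny nearest-neighbor classifier in plain Java (requires Java 16+ for records; the sample data and class names are invented):

```java
import java.util.List;

public class NearestNeighborDemo {
    // A labeled point in a two-feature space.
    record Sample(double x, double y, String label) {}

    // 1-nearest-neighbor classification: assign the label of the closest
    // training sample, by squared Euclidean distance.
    static String classify(List<Sample> training, double x, double y) {
        Sample best = null;
        double bestDist = Double.POSITIVE_INFINITY;
        for (Sample s : training) {
            double d = (s.x() - x) * (s.x() - x) + (s.y() - y) * (s.y() - y);
            if (d < bestDist) {
                bestDist = d;
                best = s;
            }
        }
        return best.label();
    }

    public static void main(String[] args) {
        // Invented two-feature data standing in for, say, email features.
        List<Sample> training = List.of(
                new Sample(0.1, 0.2, "legitimate"),
                new Sample(0.2, 0.1, "legitimate"),
                new Sample(0.9, 0.8, "spam"),
                new Sample(0.8, 0.9, "spam"));
        System.out.println(classify(training, 0.85, 0.85));  // spam
    }
}
```

The "predefined groups" here are the two labels in the training data; the algorithm only ever assigns one of them.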
MULTIMEDIA
Multimedia is media and content that uses a combination of different content forms. The term can be used as a noun (a medium with multiple content forms) or as an adjective describing a medium as having multiple content forms. The term is used in contrast to media which only use traditional forms of printed or hand-produced material.
Multimedia includes a combination of text, audio, still images, animation, video, and interactivity content forms. Multimedia is usually recorded and played, displayed or accessed by information content processing devices, such as computerized and electronic devices, but can also be part of a live performance.
Multimedia (as an adjective) also describes electronic media devices used to store and experience multimedia content. Multimedia is distinguished from mixed media in fine art; by including audio, for example, it has a broader scope. The term "rich media" is synonymous with interactive multimedia. Hypermedia can be considered one particular multimedia application.
IMAGE PROCESSING
Image processing is any form of signal processing for which the input is an image, such as photographs or frames of video; the output of image processing can be either an image or a set of characteristics or parameters related to the image. Most image-processing techniques involve treating the image as a two-dimensional signal and applying standard signal-processing techniques to it. Image processing usually refers to digital image processing, but optical and analog image processing are also possible. The acquisition of images (producing the input image in the first place) is referred to as imaging.
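Treating an image as a two-dimensional signal can be sketched with the standard java.awt.image API; the grayscale conversion below uses a common luminance weighting, and the GrayscaleDemo name is illustrative:

```java
import java.awt.image.BufferedImage;

public class GrayscaleDemo {
    // Walks the image as a 2-D signal: for each pixel, replace the RGB
    // value with its luminance (a standard weighted average of R, G, B).
    static BufferedImage toGrayscale(BufferedImage src) {
        BufferedImage dst = new BufferedImage(
                src.getWidth(), src.getHeight(), BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < src.getHeight(); y++) {
            for (int x = 0; x < src.getWidth(); x++) {
                int rgb = src.getRGB(x, y);
                int r = (rgb >> 16) & 0xFF, g = (rgb >> 8) & 0xFF, b = rgb & 0xFF;
                int lum = (int) Math.round(0.299 * r + 0.587 * g + 0.114 * b);
                dst.setRGB(x, y, (lum << 16) | (lum << 8) | lum);
            }
        }
        return dst;
    }

    public static void main(String[] args) {
        // A 2x2 synthetic input stands in for a photograph or video frame.
        BufferedImage img = new BufferedImage(2, 2, BufferedImage.TYPE_INT_RGB);
        img.setRGB(0, 0, 0xFF0000);  // pure red
        int gray = toGrayscale(img).getRGB(0, 0) & 0xFF;
        System.out.println(gray);
    }
}
```

Here the output is another image; a technique whose output is a set of parameters (an edge map's statistics, say) follows the same per-pixel access pattern.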
Image Processing, IEEE Transactions on
Signal-processing aspects of image processing, imaging systems, and image scanning, display, and printing. Includes theory, algorithms, and architectures for image coding, filtering, enhancement, restoration, segmentation, and motion estimation; image formation in tomography, radar, sonar, geophysics, astronomy, microscopy, and crystallography; image scanning, digital half-toning and display, and color reproduction.
DEPENDABLE AND SECURE COMPUTING
This publication focuses on research into foundations, methodologies, and mechanisms that support the achievement, through design, modeling, and evaluation, of systems and networks that are dependable and secure to the desired degree without compromising performance. The focus also includes measurement, modeling, and simulation techniques, and foundations for jointly evaluating, verifying, and designing for performance, security, and dependability constraints.
KNOWLEDGE AND DATA ENGINEERING
The IEEE Transactions on Knowledge and Data Engineering is an archival journal published monthly. The information published in this Transactions is designed to inform researchers, developers, managers, strategic planners, users, and others interested in state-of-the-art and state-of-the-practice activities in the knowledge and data engineering area. We are interested in well-defined theoretical results and empirical studies that have potential impact on the acquisition, management, storage, and graceful degradation of knowledge and data, as well as in provision of knowledge and data services.
Specific topics include, but are not limited to:
a) artificial intelligence techniques, including speech, voice, graphics, images, and documents;
b) knowledge and data engineering tools and techniques;
c) parallel and distributed processing;
d) real-time distributed systems;
e) system architectures, integration, and modeling;
f) database design, modeling and management;
g) query design and implementation languages;
h) distributed database control;
i) algorithms for data and knowledge management;
j) performance evaluation of algorithms and systems;
k) data communications aspects;
l) system applications and experience;
m) knowledge-based and expert systems; and,
n) integrity, security, and fault tolerance.
PARALLEL AND DISTRIBUTED SYSTEMS
IEEE Transactions on Parallel and Distributed Systems (TPDS) is published monthly. The goal of TPDS is to publish a range of papers, comments on previously published papers, and survey articles that deal with the research areas of current importance to our readers. Current areas of particular interest include, but are not limited to the following:
a) architectures: design, analysis, and implementation of multiple-processor systems (including multi-processors, multicomputers, and networks); impact of VLSI on system design; interprocessor communications;
b) software: parallel languages and compilers; scheduling and task partitioning; databases, operating systems, and programming environments for multiple-processor systems;
c) algorithms and applications: models of computation; analysis and design of parallel/distributed algorithms; application studies resulting in better multiple-processor systems;
d) other issues: performance measurements, evaluation, modeling and simulation of multiple-processor systems; real-time, reliability and fault-tolerance issues; conversion of software from sequential-to-parallel forms.
SOFTWARE ENGINEERING
The IEEE Transactions on Software Engineering is an archival journal published monthly. We are interested in well-defined theoretical results and empirical studies that have potential impact on the construction, analysis, or management of software. The scope of this Transactions ranges from the mechanisms through the development of principles to the application of those principles to specific environments. Since the journal is archival, it is assumed that the ideas presented are important, have been well analyzed, and/or empirically validated and are of value to the software engineering research or practitioner community.
Specific topic areas include:
a) development and maintenance methods and models, e.g., techniques and principles for the specification, design, and implementation of software systems, including notations and process models;
b) assessment methods, e.g., software tests and validation, reliability models, test and diagnosis procedures, software redundancy and design for error control, and the measurements and evaluation of various aspects of the process and product;
c) software project management, e.g., productivity factors, cost models, schedule and organizational issues, standards;
d) tools and environments, e.g., specific tools, integrated tool environments including the associated architectures, databases, and parallel and distributed processing issues;
e) system issues, e.g., hardware-software trade-off; and
f) state-of-the-art surveys that provide a synthesis and comprehensive review of the historical development of one particular area of interest.
2. IEEE Transactions on Computers
3. IEEE Transactions on Data Mining
4. IEEE Transactions on Dependable and Secure Computing
5. IEEE Transactions on Distributed Computing System
6. IEEE Transactions on Image Processing
7. IEEE Transactions on Information Forensics and Security
8. IEEE Transactions on Internet Computing
9. IEEE Transactions on Knowledge and Data Engineering
10. IEEE Transactions on Learning Technologies
11. IEEE Transactions on Mobile Computing
12. IEEE Transactions on Multimedia Computing
13. IEEE Transactions on Network Computing
14. IEEE Transactions on Network Security
15. IEEE Transactions on Parallel and Distributed Systems
16. IEEE Transactions on Pattern Analysis & Machine Intelligence
17. IEEE Transactions on Services Computing
18. IEEE Transactions on Signal Processing
19. IEEE Transactions on Software Engineering
20. IEEE Transactions on Vehicular Technology
21. IEEE Transactions on Visualization and Computer Graphics
22. IEEE Transactions on Web Application and Web Service
23. IEEE Transactions on Wireless Communication
24. IEEE Transactions on Wireless Sensor Networks
25. IEEE Transactions on Fuzzy systems
26. IEEE Transactions on Neural Networks
IEEE - Institute of Electrical and Electronics Engineers
A non-profit organization, IEEE is the world's leading professional association for the advancement of technology.
pronounced - Eye-triple-E
IEEE's Constitution deems the purposes of the organization as "scientific and educational, directed toward the advancement of the theory and practice of electrical, electronics, communications and computer engineering, as well as computer science, the allied branches of engineering and the related arts and sciences."
In implementing these goals, the IEEE serves as a major publisher of scientific journals and a conference organizer. It is also a leading innovator of industrial standards in a broad range of disciplines, including electric power and energy, biomedical technology and healthcare, information technology, information assurance, telecommunications, consumer electronics, transportation, aerospace, and nanotechnology.
IEEE develops and participates in educational activities such as accreditation of electrical engineering programs in institutes of higher learning. The IEEE also serves student members in colleges and universities around the world. Prospective members and organizations purchase IEEE products and participate in conferences or other IEEE programs.
SCOPE OF IEEE:
IEEE Computer Architecture Letters is a bi-annual forum for fast publication of new, high-quality ideas in the form of short, critically refereed, technical papers.
Submissions are accepted on a continuing basis, and accepted letters will be published immediately in the IEEE Digital Library and in the next available print issue. Members of the Technical Committee on Computer Architecture will receive the print issue as a benefit of being a member. Authors should submit their manuscript through Manuscript Central.
JAVA
Java is a programming language originally developed by James Gosling at Sun Microsystems (which is now a subsidiary of Oracle Corporation) and released in 1995 as a core component of Sun Microsystems' Java platform. The language derives much of its syntax from C and C++ but has a simpler object model and fewer low-level facilities. Java applications are typically compiled to bytecode (class file) that can run on any Java Virtual Machine (JVM) regardless of computer architecture. Java is a general-purpose, concurrent, class-based, object-oriented language that is specifically designed to have as few implementation dependencies as possible. It is intended to let application developers "write once, run anywhere." Java is currently one of the most popular programming languages in use, particularly for client-server web applications.
The original and reference implementation Java compilers, virtual machines, and class libraries were developed by Sun from 1995. As of May 2007, in compliance with the specifications of the Java Community Process, Sun relicensed most of its Java technologies under the GNU General Public License. Others have also developed alternative implementations of these Sun technologies, such as the GNU Compiler for Java, GNU Classpath, and Dalvik.
James Gosling, Patrick Naughton, Chris Wrath, Ed Frank, and Mike Sheridan conceived Java at Sun Micro system.
It is a platform independent programming language that extends its features wide over the network. Java2 version introduces a new component called “Swing” – is a set of classes that provides more power & flexible components than are possible with AWT. - It’s a lightweight package, as they are not implemented by platform- specific code.
Related classes are contained in javax.swing and its sub packages, such as javax.swing.tree. -Components explained in the Swing have more capabilities than those of AWT. The Java platform differs from most other platforms in that it’s a software-only platform that runs on top of other, hardware-based platforms.
Principles of Java
There were five primary goals in the creation of the Java language
1. It should be "simple, object oriented and familiar".
2. It should be "robust and secure".
3. It should be "architecture neutral and portable".
4. It should execute with "high performance".
5. It should be "interpreted, threaded, and dynamic".
DOTNET
.NET is the Microsoft’s development model in which software becomes platform and device independent and data becomes available over the Internet. The .Net Framework is the infrastructure of .NET. . NET is built from the group up on open architecture. . NET is a platform that can be used for building and running the next generation of Microsoft Windows and Web applications. The goal of the Microsoft .NET platform is to simplify web development.
The .Net Framework provides the foundation upon which application and XML web services are build and executed the unified Nature of the .Net Framework means that all applications, whether they are windows applications, web applications are XML web services are developer by using a common set tools and code, and are easily integrated with one another. Benefits of using .NET Framework:
The benefits of using the .Net Framework for developing application include:
· Based on Web standards and practices The .Net framework fully supports existing Internet technologies, including HTML, HTTP, XML, SOAP and other Web standards.
· Design using unified application models The functionality of a .Net class is available from any .Net compatible languages are programming model. Therefore, the same piece of code can be used by windows applications, web applications and XML web services.
Easy for developers to use The .NET Framework provides the unified type system, which can be used by any .NET-compatible language. In the unified type system, all language elements are objects. These objects can be used by any .Net applications written in any .NET-based language.
· Extensible classes The hierarchy of the .Net Framework is not hidden from the developer. You can access and extend .Net classes through inheritance
MOBILE COMPUTING
Mobile computing is a generic term describing one's ability to use technology while moving, as opposed to portable computers, which are only practical for use while deployed in a stationary configuration. Mobile internet access is generally slower than direct cable connections, using technologies such as GPRS and EDGE, and more recently 3G networks. These networks are usually available within range of commercial cell phone towers. Higher speed wireless LANs are inexpensive, but have very limited range.
Mobile Computing, IEEE Transactions on
The IEEE Transactions on Mobile Computing focuses on the key technical issues related to (a) architectures, (b) support services, (c) algorithm/protocol design and analysis, (d) mobile environments, (e) mobile communication systems, (f) applications, (g) components, including devices, hardware, and software, (h) implementation issues, including interference, power, and software constraints of mobile devices,
NETWORKING
A network is a group of interconnected computers. Networks may be classified according to a wide variety of characteristics. This article provides a general overview of some types and categories and also presents the basic components of a network. A computer network allows computers to communicate with each other and to share resources and information.
This publication is devoted to the timely release of high quality papers that advance the state-of-the-art and practical applications of computer networks, this journal covers such topics as: network architecture and design, communication protocols, network software, network technologies, network services and applications, and network operations management.
DATA SECURITY
Data security is the means of ensuring that data is kept safe from corruption and that access to it is suitably controlled. Thus data security helps to ensure privacy. It also helps in protecting personal data. Software based security solutions encrypt the data to prevent data from being stolen. However, a malicious program or a hacker may corrupt the data in order to make it unrecoverable or unusable. Similarly, encrypted operating systems can be corrupted by a malicious program or a hacker, making the system unusable.
Hardware-based security solutions can prevent read and write access to data and hence offers very strong protection against tampering and unauthorized access. Hardware based or assisted computer security offers an alternative to software-only computer security. Security tokens such as those using PKCS#11 may be more secure due to the physical access required in order to be compromised. Access is enabled only when the token is connected and correct PIN is entered.
CLOUD COMPUTING
Cloud computing is a style of computing in which dynamically scalable and often virtualized resources are provided as a service over the Internet. Users need not have knowledge of, expertise in, or control over the technology infrastructure in the "cloud" that supports them.
The concept generally incorporates combinations of the following:
• infrastructure as a service (IaaS)
• platform as a service (PaaS)
• software as a service (SaaS) Cloud computing customers do not generally own the physical infrastructure serving as host to the software platform in question.
Instead, they avoid capital expenditure by renting usage from a third-party provider. They consume resources as a service and pay only for resources that they use. Many cloud-computing offerings employ the utility computing model, which is analogous to how traditional utility services (such as electricity) are consumed.
GRID COMPUTING
Grid computing (or the use of computational grids) is the combination of computer resources from multiple administrative domains applied to a common task, usually to a scientific, technical or business problem that requires a great number of computer processing cycles or the need to process large amounts of data. One of the main strategies of grid computing is using software to divide and apportion pieces of a program among several computers, sometimes up to many thousands.
Grid computing is distributed, large-scale cluster computing, as well as a form of network-distributed parallel processing. The size of grid computing may vary from being small confined to a network of computer workstations within a corporation, for example to being large, public collaboration across many companies and networks.
It is a form of distributed computing whereby a “super and virtual computer” is composed of a cluster of networked loosely coupled computers acting in concert to perform very large tasks. This technology has been applied to computationally intensive scientific, mathematical, and academic problems through volunteer computing, and it is used in commercial enterprises for such diverse applications as drug discovery, economic forecasting, seismic analysis, and back-office data processing in support of e-commerce and Web services.
DATA MINING
Data mining is the process of extracting patterns from data. As more data are gathered, with the amount of data doubling every three years, data mining is becoming an increasingly important tool for transforming these data into information. It is commonly used in a wide range of profiling practices, such as marketing, surveillance, fraud detection, and scientific discovery. Data mining commonly involves four classes of tasks:
• Classification - Arranges the data into predefined groups. For example, an email program might attempt to classify an email as legitimate or spam. Common algorithms include nearest neighbor, the naive Bayes classifier, and neural networks.
• Clustering - Like classification, but the groups are not predefined; the algorithm tries to group similar items together.
• Regression - Attempts to find a function that models the data with the least error. A common method is genetic programming.
• Association rule learning - Searches for relationships between variables. For example, a supermarket might gather data on what each customer buys. Using association rule learning, the supermarket can work out which products are frequently bought together, which is useful for marketing purposes. This is sometimes referred to as "market basket analysis".
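The market basket idea above can be sketched as brute-force pair counting: tally how often each pair of products appears in the same transaction, and keep the pairs that meet a minimum support threshold. Real systems use algorithms such as Apriori or FP-growth; this small version only illustrates the idea, and the example baskets are invented.

```python
# Minimal sketch of "market basket analysis": count co-occurring product
# pairs across transactions and keep those meeting a support threshold.
from collections import Counter
from itertools import combinations

def frequent_pairs(baskets, min_support):
    """Return item pairs appearing in at least `min_support` baskets."""
    counts = Counter()
    for basket in baskets:
        # sorted(set(...)) deduplicates items and makes pairs canonical
        for pair in combinations(sorted(set(basket)), 2):
            counts[pair] += 1
    return {pair: n for pair, n in counts.items() if n >= min_support}

baskets = [
    ["bread", "milk", "butter"],
    ["bread", "milk"],
    ["milk", "eggs"],
    ["bread", "butter"],
]
print(frequent_pairs(baskets, min_support=2))
# ('bread', 'butter') and ('bread', 'milk') each occur in 2 or more baskets
```

A supermarket would read the surviving pairs as candidate association rules ("customers who buy bread often buy butter") for shelf placement or promotions.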
MULTIMEDIA
Multimedia is media and content that uses a combination of different content forms. The term can be used as a noun (a medium with multiple content forms) or as an adjective describing a medium as having multiple content forms. The term is used in contrast to media that use only traditional forms of printed or hand-produced material.
Multimedia includes a combination of text, audio, still images, animation, video, and interactivity content forms. Multimedia is usually recorded and played, displayed or accessed by information content processing devices, such as computerized and electronic devices, but can also be part of a live performance.
Multimedia (as an adjective) also describes electronic media devices used to store and experience multimedia content. Multimedia is distinguished from mixed media in fine art; by including audio, for example, multimedia has a broader scope. The term "rich media" is synonymous with interactive multimedia. Hypermedia can be considered one particular multimedia application.
IMAGE PROCESSING
Image processing is any form of signal processing for which the input is an image, such as photographs or frames of video; the output of image processing can be either an image or a set of characteristics or parameters related to the image. Most image-processing techniques involve treating the image as a two-dimensional signal and applying standard signal-processing techniques to it. Image processing usually refers to digital image processing, but optical and analog image processing are also possible. The acquisition of images (producing the input image in the first place) is referred to as imaging.
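The "image as a two-dimensional signal" idea above can be sketched with a 3x3 mean (box-blur) filter, one of the simplest standard signal-processing operations, applied to a grayscale image stored as a list of lists. Handling edge pixels by clamping coordinates to the border is one of several common choices, assumed here for simplicity.

```python
# Sketch of treating an image as a 2-D signal: slide a 3x3 averaging
# window over every pixel (a box blur). Edges are handled by clamping
# coordinates to the image border.
def box_blur(image):
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total = 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy = min(max(y + dy, 0), h - 1)  # clamp row to border
                    xx = min(max(x + dx, 0), w - 1)  # clamp column to border
                    total += image[yy][xx]
            out[y][x] = total / 9.0                  # mean of the 3x3 window
    return out

image = [
    [0, 0, 0, 0],
    [0, 9, 9, 0],
    [0, 9, 9, 0],
    [0, 0, 0, 0],
]
blurred = box_blur(image)
print(blurred[1][1])  # the bright block is smoothed into its neighbors: 4.0
```

Enhancement, restoration, and edge detection follow the same pattern, differing only in the window weights (the convolution kernel) applied at each pixel.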
IEEE Transactions on Image Processing
Signal-processing aspects of image processing, imaging systems, and image scanning, display, and printing. Includes theory, algorithms, and architectures for image coding, filtering, enhancement, restoration, segmentation, and motion estimation; image formation in tomography, radar, sonar, geophysics, astronomy, microscopy, and crystallography; image scanning, digital half-toning and display, and color reproduction.
DEPENDABLE AND SECURE COMPUTING
This publication focuses on research into foundations, methodologies, and mechanisms that support the achievement, through design, modeling, and evaluation, of systems and networks that are dependable and secure to the desired degree without compromising performance. The focus also includes measurement, modeling, and simulation techniques, and foundations for jointly evaluating, verifying, and designing for performance, security, and dependability constraints.
KNOWLEDGE AND DATA ENGINEERING
The IEEE Transactions on Knowledge and Data Engineering is an archival journal published monthly. The information published in this Transactions is designed to inform researchers, developers, managers, strategic planners, users, and others interested in state-of-the-art and state-of-the-practice activities in the knowledge and data engineering area. We are interested in well-defined theoretical results and empirical studies that have potential impact on the acquisition, management, storage, and graceful degradation of knowledge and data, as well as in the provision of knowledge and data services.
Specific topics include, but are not limited to:
a) artificial intelligence techniques, including speech, voice, graphics, images, and documents;
b) knowledge and data engineering tools and techniques;
c) parallel and distributed processing;
d) real-time distributed systems;
e) system architectures, integration, and modeling;
f) database design, modeling and management;
g) query design and implementation languages;
h) distributed database control;
i) algorithms for data and knowledge management;
j) performance evaluation of algorithms and systems;
k) data communications aspects;
l) system applications and experience;
m) knowledge-based and expert systems; and,
n) integrity, security, and fault tolerance.
PARALLEL AND DISTRIBUTED SYSTEMS
IEEE Transactions on Parallel and Distributed Systems (TPDS) is published monthly. The goal of TPDS is to publish a range of papers, comments on previously published papers, and survey articles that deal with the research areas of current importance to our readers. Current areas of particular interest include, but are not limited to the following:
a) architectures: design, analysis, and implementation of multiple-processor systems (including multi-processors, multicomputers, and networks); impact of VLSI on system design; interprocessor communications;
b) software: parallel languages and compilers; scheduling and task partitioning; databases, operating systems, and programming environments for multiple-processor systems;
c) algorithms and applications: models of computation; analysis and design of parallel/distributed algorithms; application studies resulting in better multiple-processor systems;
d) other issues: performance measurements, evaluation, modeling and simulation of multiple-processor systems; real-time, reliability and fault-tolerance issues; conversion of software from sequential to parallel forms.
SOFTWARE ENGINEERING
The IEEE Transactions on Software Engineering is an archival journal published monthly. We are interested in well-defined theoretical results and empirical studies that have potential impact on the construction, analysis, or management of software. The scope of this Transactions ranges from the mechanisms through the development of principles to the application of those principles to specific environments. Since the journal is archival, it is assumed that the ideas presented are important, have been well analyzed and/or empirically validated, and are of value to the software engineering research or practitioner community.
Specific topic areas include:
a) development and maintenance methods and models, e.g., techniques and principles for the specification, design, and implementation of software systems, including notations and process models;
b) assessment methods, e.g., software tests and validation, reliability models, test and diagnosis procedures, software redundancy and design for error control, and the measurements and evaluation of various aspects of the process and product;
c) software project management, e.g., productivity factors, cost models, schedule and organizational issues, standards;
d) tools and environments, e.g., specific tools, integrated tool environments including the associated architectures, databases, and parallel and distributed processing issues;
e) system issues, e.g., hardware-software trade-offs; and
f) state-of-the-art surveys that provide a synthesis and comprehensive review of the historical development of one particular area of interest.