
Computer Science (since January 1993)

For a specific paper, enter the identifier into the top right search box.

  • new (most recent mailing, with abstracts)
  • recent (last 5 mailings)
  • current month's cs listings
  • specific year/month: choose a year (1993–2024) and a month (01 Jan – 12 Dec)
  • Catch-up: changes since a chosen day (01–31), month (01 Jan – 12 Dec), and year (1993–2024); view results with or without abstracts
  • Search within the cs archive
  • Article statistics by year: 1993–2024

Categories within Computer Science

  • cs.AI - Artificial Intelligence ( new , recent , current month ) Covers all areas of AI except Vision, Robotics, Machine Learning, Multiagent Systems, and Computation and Language (Natural Language Processing), which have separate subject areas. In particular, includes Expert Systems, Theorem Proving (although this may overlap with Logic in Computer Science), Knowledge Representation, Planning, and Uncertainty in AI. Roughly includes material in ACM Subject Classes I.2.0, I.2.1, I.2.3, I.2.4, I.2.8, and I.2.11.
  • cs.CL - Computation and Language ( new , recent , current month ) Covers natural language processing. Roughly includes material in ACM Subject Class I.2.7. Note that work on artificial languages (programming languages, logics, formal systems) that does not explicitly address natural-language issues broadly construed (natural-language processing, computational linguistics, speech, text retrieval, etc.) is not appropriate for this area.
  • cs.CC - Computational Complexity ( new , recent , current month ) Covers models of computation, complexity classes, structural complexity, complexity tradeoffs, upper and lower bounds. Roughly includes material in ACM Subject Classes F.1 (computation by abstract devices), F.2.3 (tradeoffs among complexity measures), and F.4.3 (formal languages), although some material in formal languages may be more appropriate for Logic in Computer Science. Some material in F.2.1 and F.2.2 may also be appropriate here, but is more likely to have Data Structures and Algorithms as the primary subject area.
  • cs.CE - Computational Engineering, Finance, and Science ( new , recent , current month ) Covers applications of computer science to the mathematical modeling of complex systems in the fields of science, engineering, and finance. Papers here are interdisciplinary and applications-oriented, focusing on techniques and tools that enable challenging computational simulations to be performed, for which the use of supercomputers or distributed computing platforms is often required. Includes material in ACM Subject Classes J.2, J.3, and J.4 (economics).
  • cs.CG - Computational Geometry ( new , recent , current month ) Roughly includes material in ACM Subject Classes I.3.5 and F.2.2.
  • cs.GT - Computer Science and Game Theory ( new , recent , current month ) Covers all theoretical and applied aspects at the intersection of computer science and game theory, including work in mechanism design, learning in games (which may overlap with Learning), foundations of agent modeling in games (which may overlap with Multiagent systems), coordination, specification and formal methods for non-cooperative computational environments. The area also deals with applications of game theory to areas such as electronic commerce.
  • cs.CV - Computer Vision and Pattern Recognition ( new , recent , current month ) Covers image processing, computer vision, pattern recognition, and scene understanding. Roughly includes material in ACM Subject Classes I.2.10, I.4, and I.5.
  • cs.CY - Computers and Society ( new , recent , current month ) Covers impact of computers on society, computer ethics, information technology and public policy, legal aspects of computing, computers and education. Roughly includes material in ACM Subject Classes K.0, K.2, K.3, K.4, K.5, and K.7.
  • cs.CR - Cryptography and Security ( new , recent , current month ) Covers all areas of cryptography and security including authentication, public key cryptosystems, proof-carrying code, etc. Roughly includes material in ACM Subject Classes D.4.6 and E.3.
  • cs.DS - Data Structures and Algorithms ( new , recent , current month ) Covers data structures and analysis of algorithms. Roughly includes material in ACM Subject Classes E.1, E.2, F.2.1, and F.2.2.
  • cs.DB - Databases ( new , recent , current month ) Covers database management, data mining, and data processing. Roughly includes material in ACM Subject Classes E.2, E.5, H.0, H.2, and J.1.
  • cs.DL - Digital Libraries ( new , recent , current month ) Covers all aspects of the digital library design and document and text creation. Note that there will be some overlap with Information Retrieval (which is a separate subject area). Roughly includes material in ACM Subject Classes H.3.5, H.3.6, H.3.7, I.7.
  • cs.DM - Discrete Mathematics ( new , recent , current month ) Covers combinatorics, graph theory, applications of probability. Roughly includes material in ACM Subject Classes G.2 and G.3.
  • cs.DC - Distributed, Parallel, and Cluster Computing ( new , recent , current month ) Covers fault-tolerance, distributed algorithms, stability, parallel computation, and cluster computing. Roughly includes material in ACM Subject Classes C.1.2, C.1.4, C.2.4, D.1.3, D.4.5, D.4.7, E.1.
  • cs.ET - Emerging Technologies ( new , recent , current month ) Covers approaches to information processing (computing, communication, sensing) and bio-chemical analysis based on alternatives to silicon CMOS-based technologies, such as nanoscale electronic, photonic, spin-based, superconducting, mechanical, bio-chemical and quantum technologies (this list is not exclusive). Topics of interest include (1) building blocks for emerging technologies, their scalability and adoption in larger systems, including integration with traditional technologies, (2) modeling, design and optimization of novel devices and systems, (3) models of computation, algorithm design and programming for emerging technologies.
  • cs.FL - Formal Languages and Automata Theory ( new , recent , current month ) Covers automata theory, formal language theory, grammars, and combinatorics on words. This roughly corresponds to ACM Subject Classes F.1.1, and F.4.3. Papers dealing with computational complexity should go to cs.CC; papers dealing with logic should go to cs.LO.
  • cs.GL - General Literature ( new , recent , current month ) Covers introductory material, survey material, predictions of future trends, biographies, and miscellaneous computer-science related material. Roughly includes all of ACM Subject Class A, except it does not include conference proceedings (which will be listed in the appropriate subject area).
  • cs.GR - Graphics ( new , recent , current month ) Covers all aspects of computer graphics. Roughly includes material in all of ACM Subject Class I.3, except that I.3.5 is likely to have Computational Geometry as the primary subject area.
  • cs.AR - Hardware Architecture ( new , recent , current month ) Covers systems organization and hardware architecture. Roughly includes material in ACM Subject Classes C.0, C.1, and C.5.
  • cs.HC - Human-Computer Interaction ( new , recent , current month ) Covers human factors, user interfaces, and collaborative computing. Roughly includes material in ACM Subject Classes H.1.2 and all of H.5, except for H.5.1, which is more likely to have Multimedia as the primary subject area.
  • cs.IR - Information Retrieval ( new , recent , current month ) Covers indexing, dictionaries, retrieval, content and analysis. Roughly includes material in ACM Subject Classes H.3.0, H.3.1, H.3.2, H.3.3, and H.3.4.
  • cs.IT - Information Theory ( new , recent , current month ) Covers theoretical and experimental aspects of information theory and coding. Includes material in ACM Subject Class E.4 and intersects with H.1.1.
  • cs.LO - Logic in Computer Science ( new , recent , current month ) Covers all aspects of logic in computer science, including finite model theory, logics of programs, modal logic, and program verification. Programming language semantics should have Programming Languages as the primary subject area. Roughly includes material in ACM Subject Classes D.2.4, F.3.1, F.4.0, F.4.1, and F.4.2; some material in F.4.3 (formal languages) may also be appropriate here, although Computational Complexity is typically the more appropriate subject area.
  • cs.LG - Machine Learning ( new , recent , current month ) Papers on all aspects of machine learning research (supervised, unsupervised, reinforcement learning, bandit problems, and so on) including also robustness, explanation, fairness, and methodology. cs.LG is also an appropriate primary category for applications of machine learning methods.
  • cs.MS - Mathematical Software ( new , recent , current month ) Roughly includes material in ACM Subject Class G.4.
  • cs.MA - Multiagent Systems ( new , recent , current month ) Covers multiagent systems, distributed artificial intelligence, intelligent agents, coordinated interactions, and practical applications. Roughly covers ACM Subject Class I.2.11.
  • cs.MM - Multimedia ( new , recent , current month ) Roughly includes material in ACM Subject Class H.5.1.
  • cs.NI - Networking and Internet Architecture ( new , recent , current month ) Covers all aspects of computer communication networks, including network architecture and design, network protocols, and internetwork standards (like TCP/IP). Also includes topics, such as web caching, that are directly relevant to Internet architecture and performance. Roughly includes all of ACM Subject Class C.2 except C.2.4, which is more likely to have Distributed, Parallel, and Cluster Computing as the primary subject area.
  • cs.NE - Neural and Evolutionary Computing ( new , recent , current month ) Covers neural networks, connectionism, genetic algorithms, artificial life, adaptive behavior. Roughly includes some material in ACM Subject Class C.1.3, I.2.6, I.5.
  • cs.NA - Numerical Analysis ( new , recent , current month ) cs.NA is an alias for math.NA. Roughly includes material in ACM Subject Class G.1.
  • cs.OS - Operating Systems ( new , recent , current month ) Roughly includes material in ACM Subject Classes D.4.1, D.4.2, D.4.3, D.4.4, D.4.5, D.4.7, and D.4.9.
  • cs.OH - Other Computer Science ( new , recent , current month ) This is the classification to use for documents that do not fit anywhere else.
  • cs.PF - Performance ( new , recent , current month ) Covers performance measurement and evaluation, queueing, and simulation. Roughly includes material in ACM Subject Classes D.4.8 and K.6.2.
  • cs.PL - Programming Languages ( new , recent , current month ) Covers programming language semantics, language features, programming approaches (such as object-oriented programming, functional programming, logic programming). Also includes material on compilers oriented towards programming languages; other material on compilers may be more appropriate in Architecture (AR). Roughly includes material in ACM Subject Classes D.1 and D.3.
  • cs.RO - Robotics ( new , recent , current month ) Roughly includes material in ACM Subject Class I.2.9.
  • cs.SI - Social and Information Networks ( new , recent , current month ) Covers the design, analysis, and modeling of social and information networks, including their applications for on-line information access, communication, and interaction, and their roles as datasets in the exploration of questions in these and other domains, including connections to the social and biological sciences. Analysis and modeling of such networks includes topics in ACM Subject classes F.2, G.2, G.3, H.2, and I.2; applications in computing include topics in H.3, H.4, and H.5; and applications at the interface of computing and other disciplines include topics in J.1--J.7. Papers on computer communication systems and network protocols (e.g. TCP/IP) are generally a closer fit to the Networking and Internet Architecture (cs.NI) category.
  • cs.SE - Software Engineering ( new , recent , current month ) Covers design tools, software metrics, testing and debugging, programming environments, etc. Roughly includes material in all of ACM Subject Classes D.2, except that D.2.4 (program verification) should probably have Logic in Computer Science as the primary subject area.
  • cs.SD - Sound ( new , recent , current month ) Covers all aspects of computing with sound, and sound as an information channel. Includes models of sound, analysis and synthesis, audio user interfaces, sonification of data, computer music, and sound signal processing. Includes ACM Subject Class H.5.5, and intersects with H.1.2, H.5.1, H.5.2, I.2.7, I.5.4, I.6.3, J.5, K.4.2.
  • cs.SC - Symbolic Computation ( new , recent , current month ) Roughly includes material in ACM Subject Class I.1.
  • cs.SY - Systems and Control ( new , recent , current month ) cs.SY is an alias for eess.SY. This section includes theoretical and experimental research covering all facets of automatic control systems. The section is focused on methods of control system analysis and design using tools of modeling, simulation and optimization. Specific areas of research include nonlinear, distributed, adaptive, stochastic and robust control in addition to hybrid and discrete event systems. Application areas include automotive and aerospace control systems, network control, biological systems, multiagent and cooperative control, robotics, reinforcement learning, sensor networks, control of cyber-physical and energy-related systems, and control of computing systems.

Grad Coach

Research Topics & Ideas: CompSci & IT

50+ Computer Science Research Topic Ideas To Fast-Track Your Project

IT & Computer Science Research Topics

Finding and choosing a strong research topic is the critical first step when it comes to crafting a high-quality dissertation, thesis or research project. If you’ve landed on this post, chances are you’re looking for a computer science-related research topic, but aren’t sure where to start. Here, we’ll explore a variety of CompSci & IT-related research ideas and topic thought-starters, including algorithms, AI, networking, database systems, UX, information security and software engineering.

NB – This is just the start…

The topic ideation and evaluation process has multiple steps. In this post, we’ll kickstart the process by sharing some research topic ideas within the CompSci domain. This is the starting point, but to develop a well-defined research topic, you’ll need to identify a clear and convincing research gap, along with a well-justified plan of action to fill that gap.

If you’re new to the oftentimes perplexing world of research, or if this is your first time undertaking a formal academic research project, be sure to check out our free dissertation mini-course. In it, we cover the process of writing a dissertation or thesis from start to end. Be sure to also sign up for our free webinar that explores how to find a high-quality research topic. 

Overview: CompSci Research Topics

  • Algorithms & data structures
  • Artificial intelligence ( AI )
  • Computer networking
  • Database systems
  • Human-computer interaction
  • Information security (IS)
  • Software engineering
  • Examples of CompSci dissertation & theses

Topics/Ideas: Algorithms & Data Structures

  • An analysis of neural network algorithms’ accuracy for processing consumer purchase patterns
  • A systematic review of the impact of graph algorithms on data analysis and discovery in social media network analysis
  • An evaluation of machine learning algorithms used for recommender systems in streaming services
  • A review of approximation algorithm approaches for solving NP-hard problems
  • An analysis of parallel algorithms for high-performance computing of genomic data
  • The influence of data structures on optimal algorithm design and performance in Fintech
  • A survey of algorithms applied in internet of things (IoT) systems in supply-chain management
  • A comparison of streaming algorithm performance for the detection of elephant flows
  • A systematic review and evaluation of machine learning algorithms used in facial pattern recognition
  • Exploring the performance of a decision tree-based approach for optimizing stock purchase decisions
  • Assessing the importance of complete and representative training datasets in agricultural machine learning-based decision-making
  • A comparison of deep learning algorithms’ performance for structured and unstructured datasets with “rare cases”
  • A systematic review of noise reduction best practices for machine learning algorithms in geoinformatics.
  • Exploring the feasibility of applying information theory to feature extraction in retail datasets.
  • Assessing the use case of neural network algorithms for image analysis in biodiversity assessment

Topics & Ideas: Artificial Intelligence (AI)

  • Applying deep learning algorithms for speech recognition in speech-impaired children
  • A review of the impact of artificial intelligence on decision-making processes in stock valuation
  • An evaluation of reinforcement learning algorithms used in the production of video games
  • An exploration of key developments in natural language processing and how they impacted the evolution of chatbots
  • An analysis of the ethical and social implications of artificial intelligence-based automated marking
  • The influence of large-scale GIS datasets on artificial intelligence and machine learning developments
  • An examination of the use of artificial intelligence in orthopaedic surgery
  • The impact of explainable artificial intelligence (XAI) on transparency and trust in supply chain management
  • An evaluation of the role of artificial intelligence in financial forecasting and risk management in cryptocurrency
  • A meta-analysis of deep learning algorithm performance in predicting cyber attacks in schools


Topics & Ideas: Networking

  • An analysis of the impact of 5G technology on internet penetration in rural Tanzania
  • Assessing the role of software-defined networking (SDN) in modern cloud-based computing
  • A critical analysis of network security and privacy concerns associated with Industry 4.0 investment in healthcare.
  • Exploring the influence of cloud computing on security risks in fintech.
  • An examination of the use of network function virtualization (NFV) in telecom networks in Southern America
  • Assessing the impact of edge computing on network architecture and design in IoT-based manufacturing
  • An evaluation of the challenges and opportunities in 6G wireless network adoption
  • The role of network congestion control algorithms in improving network performance on streaming platforms
  • An analysis of network coding-based approaches for data security
  • Assessing the impact of network topology on network performance and reliability in IoT-based workspaces


Topics & Ideas: Database Systems

  • An analysis of big data management systems and technologies used in B2B marketing
  • The impact of NoSQL databases on data management and analysis in smart cities
  • An evaluation of the security and privacy concerns of cloud-based databases in financial organisations
  • Exploring the role of data warehousing and business intelligence in global consultancies
  • An analysis of the use of graph databases for data modelling and analysis in recommendation systems
  • The influence of the Internet of Things (IoT) on database design and management in the retail grocery industry
  • An examination of the challenges and opportunities of distributed databases in supply chain management
  • Assessing the impact of data compression algorithms on database performance and scalability in cloud computing
  • An evaluation of the use of in-memory databases for real-time data processing in patient monitoring
  • Comparing the effects of database tuning and optimization approaches in improving database performance and efficiency in omnichannel retailing

Topics & Ideas: Human-Computer Interaction

  • An analysis of the impact of mobile technology on human-computer interaction prevalence in adolescent men
  • An exploration of how artificial intelligence is changing human-computer interaction patterns in children
  • An evaluation of the usability and accessibility of web-based systems for CRM in the fast fashion retail sector
  • Assessing the influence of virtual and augmented reality on consumer purchasing patterns
  • An examination of the use of gesture-based interfaces in architecture
  • Exploring the impact of ease of use in wearable technology on geriatric users
  • Evaluating the ramifications of gamification in the Metaverse
  • A systematic review of user experience (UX) design advances associated with Augmented Reality
  • Comparing end-user perceptions of natural language processing algorithms for automated customer response
  • Analysing the impact of voice-based interfaces on purchase practices in the fast food industry


Topics & Ideas: Information Security

  • A bibliometric review of current trends in cryptography for secure communication
  • An analysis of secure multi-party computation protocols and their applications in cloud-based computing
  • An investigation of the security of blockchain technology in patient health record tracking
  • A comparative study of symmetric and asymmetric encryption algorithms for instant text messaging
  • A systematic review of secure data storage solutions used for cloud computing in the fintech industry
  • An analysis of intrusion detection and prevention systems used in the healthcare sector
  • Assessing security best practices for IoT devices in political offices
  • An investigation into the role social media played in shifting regulations related to privacy and the protection of personal data
  • A comparative study of digital signature schemes adoption in property transfers
  • An assessment of the security of secure wireless communication systems used in tertiary institutions

Topics & Ideas: Software Engineering

  • A study of agile software development methodologies and their impact on project success in pharmacology
  • Investigating the impacts of software refactoring techniques and tools in blockchain-based developments
  • A study of the impact of DevOps practices on software development and delivery in the healthcare sector
  • An analysis of software architecture patterns and their impact on the maintainability and scalability of cloud-based offerings
  • A study of the impact of artificial intelligence and machine learning on software engineering practices in the education sector
  • An investigation of software testing techniques and methodologies for subscription-based offerings
  • A review of software security practices and techniques for protecting against phishing attacks from social media
  • An analysis of the impact of cloud computing on the rate of software development and deployment in the manufacturing sector
  • Exploring the impact of software development outsourcing on project success in multinational contexts
  • An investigation into the effect of poor software documentation on app success in the retail sector

CompSci & IT Dissertations/Theses

While the ideas we’ve presented above are a decent starting point for finding a CompSci-related research topic, they are fairly generic and non-specific. So, it helps to look at actual dissertations and theses to see how this all comes together.

Below, we’ve included a selection of research projects from various CompSci-related degree programs to help refine your thinking. These are actual dissertations and theses, written as part of Master’s and PhD-level programs, so they can provide some useful insight as to what a research topic looks like in practice.

  • An array-based optimization framework for query processing and data analytics (Chen, 2021)
  • Dynamic Object Partitioning and replication for cooperative cache (Asad, 2021)
  • Embedding constructural documentation in unit tests (Nassif, 2019)
  • PLASA | Programming Language for Synchronous Agents (Kilaru, 2019)
  • Healthcare Data Authentication using Deep Neural Network (Sekar, 2020)
  • Virtual Reality System for Planetary Surface Visualization and Analysis (Quach, 2019)
  • Artificial neural networks to predict share prices on the Johannesburg stock exchange (Pyon, 2021)
  • Predicting household poverty with machine learning methods: the case of Malawi (Chinyama, 2022)
  • Investigating user experience and bias mitigation of the multi-modal retrieval of historical data (Singh, 2021)
  • Detection of HTTPS malware traffic without decryption (Nyathi, 2022)
  • Redefining privacy: case study of smart health applications (Al-Zyoud, 2019)
  • A state-based approach to context modeling and computing (Yue, 2019)
  • A Novel Cooperative Intrusion Detection System for Mobile Ad Hoc Networks (Solomon, 2019)
  • HRSB-Tree for Spatio-Temporal Aggregates over Moving Regions (Paduri, 2019)

Looking at these titles, you can probably pick up that the research topics here are quite specific and narrowly focused, compared to the generic ones presented earlier. This is an important thing to keep in mind as you develop your own research topic. That is to say, to create a top-notch research topic, you must be precise and target a specific context with specific variables of interest. In other words, you need to identify a clear, well-justified research gap.

Fast-Track Your Research Topic

If you’re still feeling a bit unsure about how to find a research topic for your Computer Science dissertation or research project, check out our Topic Kickstarter service.


Digital Commons @ University of South Florida


Computer Science and Engineering Theses and Dissertations

Theses/Dissertations from 2023

Refining the Machine Learning Pipeline for US-based Public Transit Systems , Jennifer Adorno

Insect Classification and Explainability from Image Data via Deep Learning Techniques , Tanvir Hossain Bhuiyan

Brain-Inspired Spatio-Temporal Learning with Application to Robotics , Thiago André Ferreira Medeiros

Evaluating Methods for Improving DNN Robustness Against Adversarial Attacks , Laureano Griffin

Analyzing Multi-Robot Leader-Follower Formations in Obstacle-Laden Environments , Zachary J. Hinnen

Secure Lightweight Cryptographic Hardware Constructions for Deeply Embedded Systems , Jasmin Kaur

A Psychometric Analysis of Natural Language Inference Using Transformer Language Models , Antonio Laverghetta Jr.

Graph Analysis on Social Networks , Shen Lu

Deep Learning-based Automatic Stereology for High- and Low-magnification Images , Hunter Morera

Deciphering Trends and Tactics: Data-driven Techniques for Forecasting Information Spread and Detecting Coordinated Campaigns in Social Media , Kin Wai Ng Lugo

Automated Approaches to Enable Innovative Civic Applications from Citizen Generated Imagery , Hye Seon Yi

Theses/Dissertations from 2022

Towards High Performing and Reliable Deep Convolutional Neural Network Models for Typically Limited Medical Imaging Datasets , Kaoutar Ben Ahmed

Task Progress Assessment and Monitoring Using Self-Supervised Learning , Sainath Reddy Bobbala

Towards More Task-Generalized and Explainable AI Through Psychometrics , Alec Braynen

A Multiple Input Multiple Output Framework for the Automatic Optical Fractionator-based Cell Counting in Z-Stacks Using Deep Learning , Palak Dave

On the Reliability of Wearable Sensors for Assessing Movement Disorder-Related Gait Quality and Imbalance: A Case Study of Multiple Sclerosis , Steven Díaz Hernández

Securing Critical Cyber Infrastructures and Functionalities via Machine Learning Empowered Strategies , Tao Hou

Social Media Time Series Forecasting and User-Level Activity Prediction with Gradient Boosting, Deep Learning, and Data Augmentation , Fred Mubang

A Study of Deep Learning Silhouette Extractors for Gait Recognition , Sneha Oladhri

Analyzing Decision-making in Robot Soccer for Attacking Behaviors , Justin Rodney

Generative Spatio-Temporal and Multimodal Analysis of Neonatal Pain , Md Sirajus Salekin

Secure Hardware Constructions for Fault Detection of Lattice-based Post-quantum Cryptosystems , Ausmita Sarker

Adaptive Multi-scale Place Cell Representations and Replay for Spatial Navigation and Learning in Autonomous Robots , Pablo Scleidorovich

Predicting the Number of Objects in a Robotic Grasp , Utkarsh Tamrakar

Humanoid Robot Motion Control for Ramps and Stairs , Tommy Truong

Preventing Variadic Function Attacks Through Argument Width Counting , Brennan Ward

Theses/Dissertations from 2021

Knowledge Extraction and Inference Based on Visual Understanding of Cooking Contents , Ahmad Babaeian Jelodar

Efficient Post-Quantum and Compact Cryptographic Constructions for the Internet of Things , Rouzbeh Behnia

Efficient Hardware Constructions for Error Detection of Post-Quantum Cryptographic Schemes , Alvaro Cintas Canto

Using Hyper-Dimensional Spanning Trees to Improve Structure Preservation During Dimensionality Reduction , Curtis Thomas Davis

Design, Deployment, and Validation of Computer Vision Techniques for Societal Scale Applications , Arup Kanti Dey

AffectiveTDA: Using Topological Data Analysis to Improve Analysis and Explainability in Affective Computing , Hamza Elhamdadi

Automatic Detection of Vehicles in Satellite Images for Economic Monitoring , Cole Hill

Analysis of Contextual Emotions Using Multimodal Data , Saurabh Hinduja

Data-driven Studies on Social Networks: Privacy and Simulation , Yasanka Sameera Horawalavithana

Automated Identification of Stages in Gonotrophic Cycle of Mosquitoes Using Computer Vision Techniques , Sherzod Kariev

Exploring the Use of Neural Transformers for Psycholinguistics , Antonio Laverghetta Jr.

Secure VLSI Hardware Design Against Intellectual Property (IP) Theft and Cryptographic Vulnerabilities , Matthew Dean Lewandowski

Turkic Interlingua: A Case Study of Machine Translation in Low-resource Languages , Jamshidbek Mirzakhalov

Automated Wound Segmentation and Dimension Measurement Using RGB-D Image , Chih-Yun Pai

Constructing Frameworks for Task-Optimized Visualizations , Ghulam Jilani Abdul Rahim Quadri

Trilateration-Based Localization in Known Environments with Object Detection , Valeria M. Salas Pacheco

Recognizing Patterns from Vital Signs Using Spectrograms , Sidharth Srivatsav Sribhashyam

Recognizing Emotion in the Wild Using Multimodal Data , Shivam Srivastava

A Modular Framework for Multi-Rotor Unmanned Aerial Vehicles for Military Operations , Dante Tezza

Human-centered Cybersecurity Research — Anthropological Findings from Two Longitudinal Studies , Anwesh Tuladhar

Learning State-Dependent Sensor Measurement Models To Improve Robot Localization Accuracy , Troi André Williams

Human-centric Cybersecurity Research: From Trapping the Bad Guys to Helping the Good Ones , Armin Ziaie Tabari

Theses/Dissertations from 2020

Classifying Emotions with EEG and Peripheral Physiological Data Using 1D Convolutional Long Short-Term Memory Neural Network , Rupal Agarwal

Keyless Anti-Jamming Communication via Randomized DSSS , Ahmad Alagil

Active Deep Learning Method to Automate Unbiased Stereology Cell Counting , Saeed Alahmari

Composition of Atomic-Obligation Security Policies , Yan Cao Albright

Action Recognition Using the Motion Taxonomy , Maxat Alibayev

Sentiment Analysis in Peer Review , Zachariah J. Beasley

Spatial Heterogeneity Utilization in CT Images for Lung Nodule Classification , Dmitrii Cherezov

Feature Selection Via Random Subsets Of Uncorrelated Features , Long Kim Dang

Unifying Security Policy Enforcement: Theory and Practice , Shamaria Engram

PsiDB: A Framework for Batched Query Processing and Optimization , Mehrad Eslami

Composition of Atomic-Obligation Security Policies , Danielle Ferguson

Algorithms To Profile Driver Behavior From Zero-permission Embedded Sensors , Bharti Goel

The Efficiency and Accuracy of YOLO for Neonate Face Detection in the Clinical Setting , Jacqueline Hausmann

Beyond the Hype: Challenges of Neural Networks as Applied to Social Networks , Anthony Hernandez

Privacy-Preserving and Functional Information Systems , Thang Hoang

Managing Off-Grid Power Use for Solar Fueled Residences with Smart Appliances, Prices-to-Devices and IoT , Donnelle L. January

Novel Bit-Sliced In-Memory Computing Based VLSI Architecture for Fast Sobel Edge Detection in IoT Edge Devices , Rajeev Joshi

Edge Computing for Deep Learning-Based Distributed Real-time Object Detection on IoT Constrained Platforms at Low Frame Rate , Lakshmikavya Kalyanam

Establishing Topological Data Analysis: A Comparison of Visualization Techniques , Tanmay J. Kotha

Machine Learning for the Internet of Things: Applications, Implementation, and Security , Vishalini Laguduva Ramnath

System Support of Concurrent Database Query Processing on a GPU , Hao Li

Deep Learning Predictive Modeling with Data Challenges (Small, Big, or Imbalanced) , Renhao Liu

Countermeasures Against Various Network Attacks Using Machine Learning Methods , Yi Li

Towards Safe Power Oversubscription and Energy Efficiency of Data Centers , Sulav Malla

Design of Support Measures for Counting Frequent Patterns in Graphs , Jinghan Meng

Automating the Classification of Mosquito Specimens Using Image Processing Techniques , Mona Minakshi

Models of Secure Software Enforcement and Development , Hernan M. Palombo

Functional Object-Oriented Network: A Knowledge Representation for Service Robotics , David Andrés Paulius Ramos

Lung Nodule Malignancy Prediction from Computed Tomography Images Using Deep Learning , Rahul Paul

Algorithms and Framework for Computing 2-body Statistics on Graphics Processing Units , Napath Pitaksirianan

Efficient Viewshed Computation Algorithms On GPUs and CPUs , Faisal F. Qarah

Relational Joins on GPUs for In-Memory Database Query Processing , Ran Rui

Micro-architectural Countermeasures for Control Flow and Misspeculation Based Software Attacks , Love Kumar Sah

Efficient Forward-Secure and Compact Signatures for the Internet of Things (IoT) , Efe Ulas Akay Seyitoglu

Detecting Symptoms of Chronic Obstructive Pulmonary Disease and Congestive Heart Failure via Cough and Wheezing Sounds Using Smart-Phones and Machine Learning , Anthony Windmon

Toward Culturally Relevant Emotion Detection Using Physiological Signals , Khadija Zanna

Theses/Dissertations from 2019

Beyond Labels and Captions: Contextualizing Grounded Semantics for Explainable Visual Interpretation , Sathyanarayanan Narasimhan Aakur

Empirical Analysis of a Cybersecurity Scoring System , Jaleel Ahmed

Phenomena of Social Dynamics in Online Games , Essa Alhazmi

A Machine Learning Approach to Predicting Community Engagement on Social Media During Disasters , Adel Alshehri

Interactive Fitness Domains in Competitive Coevolutionary Algorithm , ATM Golam Bari

Measuring Influence Across Social Media Platforms: Empirical Analysis Using Symbolic Transfer Entropy , Abhishek Bhattacharjee

A Communication-Centric Framework for Post-Silicon System-on-chip Integration Debug , Yuting Cao

Authentication and SQL-Injection Prevention Techniques in Web Applications , Cagri Cetin

Multimodal Emotion Recognition Using 3D Facial Landmarks, Action Units, and Physiological Data , Diego Fabiano

Robotic Motion Generation by Using Spatial-Temporal Patterns from Human Demonstrations , Yongqiang Huang

A GPU-Based Framework for Parallel Spatial Indexing and Query Processing , Zhila Nouri Lewis

A Flexible, Natural Deduction, Automated Reasoner for Quick Deployment of Non-Classical Logic , Trisha Mukhopadhyay

An Efficient Run-time CFI Check for Embedded Processors to Detect and Prevent Control Flow Based Attacks , Srivarsha Polnati

Force Feedback and Intelligent Workspace Selection for Legged Locomotion Over Uneven Terrain , John Rippetoe

Detecting Digitally Forged Faces in Online Videos , Neilesh Sambhu

Malicious Manipulation in Service-Oriented Network, Software, and Mobile Systems: Threats and Defenses , Dakun Shen



Computer science articles from across Nature Portfolio

Computer science is the study and development of the protocols required for automated processing and manipulation of data. This includes, for example, creating algorithms for efficiently searching large volumes of information or encrypting data so that it can be stored and transmitted securely.
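To make the two activities named in this definition concrete, here is a minimal, illustrative Python sketch (not drawn from any of the articles listed on this page): binary search stands in for efficiently searching a sorted index, and the Fernet recipe from the third-party cryptography package (an assumed dependency, installed separately) stands in for encrypting data before it is stored or transmitted.

```python
# Minimal sketch of the two activities named above (illustrative only).
# Efficient search: binary search over a sorted index, O(log n) comparisons.
# Secure storage/transmission: symmetric encryption via the third-party
# `cryptography` package (assumed installed: pip install cryptography).
from cryptography.fernet import Fernet


def binary_search(sorted_items, target):
    """Return the index of `target` in `sorted_items`, or -1 if absent."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1


if __name__ == "__main__":
    index = sorted(["arxiv", "dissertation", "nature", "thesis"])
    print(binary_search(index, "nature"))    # -> 2, found without scanning everything

    key = Fernet.generate_key()              # secret symmetric key
    token = Fernet(key).encrypt(b"record")   # ciphertext, safe to store or send
    print(Fernet(key).decrypt(token))        # b'record' recovered only with the key
```

Because binary search halves the candidate range at each step, even an index with a billion entries needs only about thirty comparisons, which is what "efficiently searching large volumes of information" amounts to in practice.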

Latest Research and Reviews


Co-ordinate-based positional embedding that captures resolution to enhance transformer’s performance in medical image analysis

  • Badhan Kumar Das
  • Gengyan Zhao
  • Andreas Maier


A new method based on YOLOv5 and multiscale data augmentation for visual inspection in substation

  • Junjie Chen


The benefits, risks and bounds of personalizing the alignment of large language models to individuals

Tailoring the alignment of large language models (LLMs) to individuals is a new frontier in generative AI, but unbounded personalization can bring potential harm, such as large-scale profiling, privacy infringement and bias reinforcement. Kirk et al. develop a taxonomy for risks and benefits of personalized LLMs and discuss the need for normative decisions on what are acceptable bounds of personalization.

  • Hannah Rose Kirk
  • Bertie Vidgen
  • Scott A. Hale


Dual-branch feature encoding framework for infrared images super-resolution reconstruction


The use of residual analysis to improve the error rate accuracy of machine translation

  • Ľubomír Benko
  • Dasa Munkova


SM-CycleGAN: crop image data enhancement method based on self-attention mechanism CycleGAN

  • Dabin Zhang


News and Comment


AI now beats humans at basic tasks — new benchmarks are needed, says major report

Stanford University’s 2024 AI Index charts the meteoric rise of artificial-intelligence tools.

  • Nicola Jones


Medical artificial intelligence should do no harm

Bias and distrust in medicine have been perpetuated by the misuse of medical equations, algorithms and devices. Artificial intelligence (AI) can exacerbate these problems. However, AI also has potential to detect, mitigate and remedy the harmful effects of bias to build trust and improve healthcare for everyone.

  • Melanie E. Moses
  • Sonia M. Gipson Rankin


AI hears hidden X factor in zebra finch love songs

Machine learning detects song differences too subtle for humans to hear, and physicists harness the computing power of the strange skyrmion.

  • Nick Petrić Howe
  • Benjamin Thompson

Three reasons why AI doesn’t model human language

  • Johan J. Bolhuis
  • Stephen Crain
  • Andrea Moro


Generative artificial intelligence in chemical engineering

Generative artificial intelligence will transform the way we design and operate chemical processes, argues Artur M. Schweidtmann.

  • Artur M. Schweidtmann


Why scientists trust AI too much — and what to do about it

Some researchers see superhuman qualities in artificial intelligence. All scientists need to be alert to the risks this creates.


DigitalCommons@University of Nebraska - Lincoln


Computer Science and Engineering, Department of

School of Computing: Faculty Publications

Studying Developer Eye Movements to Measure Cognitive Workload and Visual Effort for Expertise Assessment , Salwa D. Aljehane, Bonita Sharif, and Jonathan I. Maletic

Co-Existence with IEEE 802.11 Networks in the ISM Band Without Channel Estimation , Muhammad Naveed Aman, Muhammad Ishfaq, and Biplab Sikdar

Dynamic Resource Optimization for Energy-Efficient 6G-IoT Ecosystems , James Adu Ansere, Mohsin Kamal, Izaz Ahmad Khan, and Muhammad Naveed Aman

Towards Modeling Human Attention from Eye Movements for Neural Source Code Summarization , Aakash Bansal, Bonita Sharif, and Collin McMillan

On Approximating Total Variation Distance , Arnab Bhattacharyya, Sutanu Gayen, Kuldeep S. Meel, Dimitrios Myrisiotis, A. Pavan, and N. V. Vinodchandran

Rapid: Region-Based Pointer Disambiguation , Khushboo Chitre, Piyus Kedia, and Rahul Purandare

Dynamic Field Programmable Logic-Driven Soft Exosuit , Frances Cleary, Witawas Srisa-an, David C. Henshall, and Sasitharan Balasubramaniam

Convolutional Neural Networks Analysis Reveals Three Possible Sources of Bronze Age Writings between Greece and India , Shruti Daggumati and Peter Z. Revesz

EmergeNet: A novel deep-learning based ensemble segmentation model for emergence timing detection of coleoptile , Aankit Das, Sruti Das Choudhury, Amit Kumar Das, Ashok Samal, and Tala Awada

Network Slicing via Transfer Learning aided Distributed Deep Reinforcement Learning , Tianlun Hu, Qi Liao, Qiang Liu, and Georg Carle

Relative Comparison of Modern Computing to Computer Technology of Ages , Iwasan D. Kejawa Dr. and Hailly Rubio Ms.

Conversion of fat to cellular fuel—Fatty acids 𝛽-oxidation model , Sylwester M. Kloska, Krzysztof Pałczyński, Tomasz Marciniak, Tomasz Talaśka, Marissa Miller, Beata J. Wysocki, Paul Davis, and Tadeusz A. Wysocki

AgRIS: wind-adaptive wideband reconfigurable intelligent surfaces for resilient wireless agricultural networks at millimeter-wave spectrum , Shuai Nie and M. C. Vuran

Perceptual cue-guided adaptive image downscaling for enhanced semantic segmentation on large document images , Chulwoo Pack, Leen-Kiat Soh, and Elizabeth Lorang

Ethical Design of Computers: From Semiconductors to IoT and Artificial Intelligence , Sudeep Pasricha and Marilyn Wolf

OSC-CO2: coattention and cosegmentation framework for plant state change with multiple features , Rubi Quiñones, Ashok Samal, Sruti Das Choudhury, and Francisco Muñoz-Arriola

A Generalization of the Chomsky-Halle Phonetic Representation using Real Numbers for Robust Speech Recognition in Noisy Environments , Peter Z. Revesz

A Markovian Error Model for False Negatives in DNN-based Perception-Driven Control Systems , Kruttidipta Samal, Thomas Walton, Tran Hoang-Dung, and Marilyn Wolf

3DGAUnet: 3D Generative Adversarial Networks with a 3D U-Net Based Generator to Achieve the Accurate and Effective Synthesis of Clinical Tumor Image Data for Pancreatic Cancer , Yu Shi, Hannah Tang, Michael J. Baine, Michael A. Hollingsworth, Huijing Du, Dandan Zheng, Chi Zhang, and Hongfeng Yu

Revealing gene regulation-based neural network computing in bacteria , Samitha S. Somathilaka, Sasitharan Balasubramaniam, Daniel P. Martins, and Xu Li

Extending the breadth of saliva metabolome fingerprinting by smart template strategies and effective pattern realignment on comprehensive two-dimensional gas chromatographic data , Simone Squara, Friederike Manig, Thomas Henle, Michael Hellwig, Andrea Caratti, Carlo Bicchi, Stephen E. Reichenbach, Qingping Tao, Massimo Collino, and Chiara Cordero

MFA-DVR: direct volume rendering of MFA models , Jianxin Sun, David Lenz, Hongfeng Yu, and Tom Peterka

Metamobility: Connecting Future Mobility with Metaverse , Haoxin Wang, Ziran Wang, Dawei Chen, Qiang Liu, Hongyu Ke, and Kyungtae Han

A Light-weight Technique to Detect GPS Spoofing using Attenuated Signal Envelopes , Xiao Wei, Muhammad Naveed Aman, and Biplab Sikdar

From Laboratory to Field: Unsupervised Domain Adaptation for Plant Disease Recognition in the Wild , Xinlu Wu, Xijian Fan, Peng Luo, Sruti Das Choudhury, Tardi Tjahjadi, and Chunhua Hu

Leaf-Counting in Monocot Plants Using Deep Regression Models , Xinyan Xie, Yufeng Ge, Harkamal Walia, Jinliang Yang, and Hongfeng Yu

Next-Generation Sequencing Data-Based Association Testing of a Group of Genetic Markers for Complex Responses Using a Generalized Linear Model Framework , Zheng Xu, Song Yan, Cong Wu, Qing Duan, Sixia Chen, and Yun Li

Efficient Two-Stage Analysis for Complex Trait Association with Arbitrary Depth Sequencing Data , Zheng Xu, Song Yan, Shuai Yuan, Cong Wu, Sixia Chen, and Zifang Guo

A Roadmap for the Human Gut Cell Atlas , Matthias Zilbauer, Kylie R. James, Mandeep Kaur, Sebastian Pott, Zhixin Li, Albert Burger, Jay R. Thiagarajah, Joseph Burclaff, Frode L. Jahnsen, Francesca Perrone, Alexander D. Ross, Gianluca Matteoli, Nathalie Stakenborg, Tomohisa Sujino, Andreas Moor, Raquel Bartolome-Casado, Espen S. Bækkevold, Ran Zhou, Bingqing Xie, Ken S. Lau, Shahida Din, Scott T. Magness, Qiuming Yao, Semir Beyaz, Mark Arends, Alexandre Denadai-Souza, Lori A. Coburn, Jellert T. Gaublomme, Richard Baldock, Irene Papatheodorou, Jose Ordovas-Montanes, Guy Boeckxstaens, Anna Hupalowska, and Sarah A. Teichmann

MR-PIPA: An Integrated Multi-level RRAM (HfOx) based Processing-In-Pixel Accelerator , Minhaz Abedin, Arman Roohi, Maximilian Liehr, Nathaniel Cady, and Shaahin Angizi

Enabling Intelligent IoTs for Histopathology Image Analysis Using Convolutional Neural Networks , Mohammed H. Alali, Arman Roohi, Shaahin Angizi, and Jitender S. Deogun

Realizing Molecular Machine Learning through Communications for Biological AI: Future Directions and Challenges , Sasitharan Balasubramaniam, Samitha Somathilaka, Sehee Sun, Adrian Ratwatte, and Massimiliano Pierobon

Subjective Information and Survival in a Simulated Biological System , Tyler S. Barker, Massimiliano Pierobon, and Peter J. Thomas

ICEBAR: Feedback-Driven Iterative Repair of Alloy Specifications , Simón Gutiérrez Brida, Germán Regis, Guolong Zheng, Hamid Bagheri, ThanhVu Nguyen, Nazareno Aguirre, and Marcelo Frias

Security, Trust and Privacy for Cloud, Fog and Internet of Things , Chien-Ming Chen, Shehzad Ashraf Chaudhry, Kuo-Hui Yeh, and Muhammad Naveed Aman

The Road Not Taken: Exploring Alias Analysis Based Optimizations Missed by the Compiler , Khushboo Chitre, Piyus Kedia, and Rahul Purandare

Pitfalls and Guidelines for Using Time-Based Git Data , Samuel W. Flint, Jigyasa Chauhan, and Robert Dyer

Quasi-Spherical Absorbing Receiver Model of Glioblastoma Cells For Exosome-based Molecular Communications , Caio Fonseca, Michael Taynan Barros, Andreani Odysseos, Srivatsan Kidambi, and Sasitharan Balasubramaniam

Incoherent and Online Dictionary Learning Algorithm for Motion Prediction , Farrukh Hafeez, Usman Ullah Sheikh, Asif Iqbal, and Muhammad Naveed Aman

Inter-Cell Slicing Resource Partitioning via Coordinated Multi-Agent Deep Reinforcement Learning , Tianlun Hu, Qi Liao, Qiang Liu, Dan Wellington, and Georg Carle

Using deep learning to detect digitally encoded DNA trigger for Trojan malware in Bio-Cyber attacks , M. S. Islam, S. Ivanov, H. Awan, J. Drohan, Sasitharan Balasubramaniam, L. Coffey, Srivatsan Kidambi, and W. Srisa-an

Decision-Theoretic Planning with Communication in Open Multiagent Systems , Anirudh Kakarlapudi, Gayathri Anil, Adam Eck, Prashant Doshi, and Leen-Kiat Soh

Society Dilemma of Computer Technology Management in Today's World , Iwasan D. Kejawa Ed.D

Deep Reinforcement Learning for End-to-End Network Slicing: Challenges and Solutions , Qiang Liu, Nakjung Choi, and Tao Han

Real-Time Dynamic Map with Crowdsourcing Vehicles in Edge Computing , Qiang Liu, Tao Han, Jiang (Linda) Xie, and BaekGyu Kim

Internal Model Control (IMC)-Based Active and Reactive Power Control of Brushless Double-Fed Induction Generator with Notch Filter , Ahsanullah Memon, Mohd Wazir Bin Mustafa, Zohaib Hussain Laghari, Touqeer Ahmed Jumani, Waqas Anjum, Shafi Ullah, and Muhammad Naveed Aman

Systems-Based Approach for Optimization of Assembly-Free Bacterial MLST Mapping , Natasha Pavlovikj, Joao Carlos Gomes-Neto, Jitender Deogun, and Andrew Benson

Room-temperature polariton quantum fluids in halide perovskites , Kai Peng, Renjie Tao, Louis Haeberlé, Quanwei Li, Dafei Jin, Graham R. Fleming, Stéphane Kéna-Cohen, Xiang Zhang, and Wei Bao

What Makes the Article “Condition Monitoring and Fault Diagnosis of Electrical Motors—A Review” So Popular? , Wei Qiao

Decipherment Challenges Due to Tamga and Letter Mix-Ups in an Old Hungarian Runic Inscription from the Altai Mountains , Peter Revesz

Profiling a Community-Specific Function Landscape for Bacterial Peptides Through Protein-Level Meta-Assembly and Machine Learning , Mitra Vajjala, Brady Johnson, Lauren Kasparek, Michael Leuze, and Qiuming Yao

Nanomechanical Resonators: Toward Atomic Scale , Bo Xu, Pengcheng Zhang, Jiankai Zhu, Zuheng Liu, Alexander Eichler, Xu-Qian Zheng, Jaesung Lee, Aneesh Dash, Swapnil More, Song Wu, Yanan Wang, Hao Jia, Akshay Naik, Adrian Bachtold, Rui Yang, Philip X.-L. Feng, and Zenghui Wang

Neural Network Repair with Reachability Analysis , Xiaodong Yang, Tom Yamaguchi, Tran Hoang-Dung, Bardh Hoxha, Taylor T. Johnson, and Danil Prokhorov

High throughput analysis of leaf chlorophyll content in sorghum using RGB, hyperspectral, and fluorescence imaging and sensor fusion , Huichun Zhang, Yufeng Ge, Xinyan Xie, Abbas Atefi, Nuwan Wijewardane, and Suresh Thapa

Rethinking Sampled-Data Control for Unmanned Aircraft Systems , Xinkai Zhang and Justin M. Bradley

Deja Vu: semantics-aware recording and replay of high-speed eye tracking and interaction data to support cognitive studies of software engineering tasks—methodology and analyses , Vlas Zyrianov, Cole S. Peterson, Drew T. Guarnera, Joshua Behler, Praxis Weston, Bonita Sharif Ph.D., and Jonathan I. Maletic

Visual Growth Tracking for Automated Leaf Stage Monitoring Based on Image Sequence Analysis , Srinidhi Bashyam, Sruti Das Choudhury, Ashok Samal, and Tala Awada

Aerial Flight Paths for Communication , Alisha Bevins and Brittany Duncan

Facility Location Games with Ordinal Preferences , Hau Chan, Minming Li, and Chenhao Wang

Fingerlings mass estimation: A comparison between deep and shallow learning algorithms , Adair da Silva Oliveira Junior, Diego André Sant’Ana, Marcio Carneiro Brito Pache, Vanir Garcia, Vanessa Aparecida de Moares Weber, Gilberto Astolfi, Fabricio de Lima Weber, Geazy Vilharva Menezes, Gabriel Kirsten Menezes, Pedro Lucas França Albuquerque, Celso Soares Costa, Eduardo Quirino Arguelho de Queiroz, João Victor Araújo Rozales, Milena Wolff Ferreira, Marco Hiroshi Naka, and Hemerson Pistori

Fire Suppression and Ignition with Unmanned Aerial Vehicles , Carrick Detweiler, Sebastian Elbaum, James Higgins, Christian Laney, Craig Allen, Dirac L. Twidwell Jr, and Evan Michale Beachly

HyperSeed: An End-to-End Method to Process Hyperspectral Images of Seeds , Tian Gao, Anil Kumar Nalini Chandran, Puneet Paul, Harkamal Walia, and Hongfeng Yu

Novel 3D Imaging Systems for High-Throughput Phenotyping of Plants , Tian Gao, Feiyu Zhu, Puneet Paul, Jaspreet Sandhu, Henry Akrofi Doku, Jianxin Sun, Yu Pan, Paul Staswick, Harkamal Walia, and Hongfeng Yu

Human body-fluid proteome: quantitative profiling and computational prediction , Lan Huang, Dan Shao, Yan Wang, Xueteng Cui, Yufei Li, Qian Chen, and Juan Cui

University of Nebraska unmanned aerial system (UAS) profiling during the LAPSE-RATE field campaign , Ashraful Islam, Ajay Shankar, Adam Houston, and Carrick Detweiler

The Integral of Education Technology in the Society , Prof. Iwasan D. Kejawa Ed.D

Optimal Container Migration for Mobile Edge Computing: Algorithm, System Design and Implementation , Taewoon Kim, Motassem Al-Tarazi, Jenn-Wei Lin, and Wooyeol Choi

Microfluidic-based Bacterial Molecular Computing on a Chip , Daniel P. Martins, Michael Taynnan Barros, Benjamin O'Sullivan, Ian Seymour, Alan O'Riordan, Lee Coffey, Joseph Sweeney, and Sasitharan Balasubramaniam

A Task-Driven Feedback Imager with Uncertainty Driven Hybrid Control , Burhan A. Mudassar, Priyabrata Saha, Marilyn Wolf, and Saibal Mukhopadhyay

Model Counting meets F0 Estimation , A. Pavan, N. V. Vinodchandran, Arnab Bhattacharyya, and Kuldeep S. Meel

Multi-feature data repository development and analytics for image cosegmentation in high-throughput plant phenotyping , Rubi Quiñones, Francisco Munoz-Arriola, Sruti Das Choudhury, and Ashok Samal

A tiling algorithm-based string similarity measure , Peter Revesz

Combined Untargeted and Targeted Fingerprinting by Comprehensive Two-Dimensional Gas Chromatography to Track Compositional Changes on Hazelnut Primary Metabolome during Roasting , Marta Cialiè Rosso, Federico Stilo, Carlo Bicchi, Melanie Charron, Ginevra Rosso, Roberto Menta, Stephen Reichenbach, Christoph H. Weinert, Carina I. Mack, Sabine E. Kulling, and Chiara Cordero

DeepSec: a deep learning framework for secreted protein discovery in human body fluids , Dan Shao, Lan Huang, Yan Wang, Kai He, Xueteng Cui, Yao Wang, Qin Ma, and Juan Cui

A study of the generalizability of self-supervised representations , Atharva Tendle and Mohammad Rashedul Hasan

PhenoImage: An open-source graphical user interface for plant image analysis , Feiyu Zhu, Manny Saluja, Jaspinder Singh, Puneet Paul, Scott E. Sattler, Paul Staswick, Harkamal Walia, and Hongfeng Yu

Systems and Methods for Reducing the Actuation Voltage for Electrostatic MEMS Devices , Fadi M. Alsaleem and Mohammad H. Hasan

Comparative evaluation of machine learning models for groundwater quality assessment , Shine Bedi, Ashok Samal, Chittaranjan Ray, and Daniel D. Snow

Validating a CS Attitudes Instrument , Ryan Bockmon, Stephen Cooper, Jonathan Gratch, and Mohsen Dorodchi

A CS1 Spatial Skills Intervention and the Impact on Introductory Programming Abilities , Ryan Bockmon, Stephen Cooper, William Koperski, Jonathan Gratch, Sheryl Sorby, and Mohsen Dorodchi

Elucidation of molecular links between obesity and cancer through microRNA regulation , Haluk Dogan, Jiang Shu, Zeynep M. Hakguder, Zheng Xu, and Juan Cui

Variability in the Effectiveness of Psychological Interventions based on Machine Learning in STEM Education , Mohammad Hasan and Bilal Khan

Resource Allocation and QoS Guarantees for Real World IP Traffic in Integrated XG-PON and IEEE802.11e EDCA Networks , Ravneet Kaur, Akshita Gupta, Anand Srivastava, Bijoy Chand Chatterjee, Abhijit Mitra, Byrav Ramamurthy, and Vivek Ashok Bohara

Global Technology Economic Analysis Paradigm , Iwasan D. Kejawa Ed.D

THE EFFECTS OF COMPUTER AND INFORMATION TECHNOLOGY ON EDUCATION , Iwasan D. Kejawa Ed.D

Development and Validation of the Computational Thinking Concepts and Skills Test , Markeya S. Peteranetz, Patrick M. Morrow, and Leen-Kiat Soh

A Multi-level Analysis of the Relationship between Instructional Practices and Retention in Computer Science , Markeya S. Peteranetz and Leen-Kiat Soh

ELECTROMAGNETIC POWER CONVERTER , Wei Qiao, Liyan Qu, and Haosen Wang

Estimating the maximum rise in temperature according to climate models using abstract interpretation , Peter Revesz and Robert J. Woodward

Paired Trial Classification: A Novel Deep Learning Technique for MVPA , Jacob M. Williams, Ashok Samal, Prahalada K. Rao, and Matthew R. Johnson

Platinum: Reusing Constraint Solutions in Bounded Analysis of Relational Logic , Guolong Zheng, Hamid Bagheri, Gregg Rothermel, and Jianghao Wang

Microbiome-Gut-Brain Axis as a Biomolecular Communication Network for the Internet of Bio-NanoThings , Ian F. Akyildiz, Jiande Chen, Maysam Ghovanloo, Ulkuhan Guler, Tevhide Ozkaya-Ahmadov, Massimiliano Pierobon, A Faith Sarioglu, and Bige D. Unluturk

Intercomparison of Small Unmanned Aircraft System (sUAS) Measurements for Atmospheric Science during the LAPSE-RATE Campaign , Lindsay Barbieri, Stephan T. Kral, Sean C. C. Bailey, Amy E. Frazier, Jamey D. Jacob, Joachim Reuder, David Brus, Phillip B. Chilson, Christopher Crick, Carrick Detweiler, Abhiram Doddi, Jack Elston, Hosein Foroutan, Javier Gonzalez-Rocha, Brian R. Greene, Marcelo I. Guzman, Adam L. Houston, Ashraful Islam, Osku Kemppinen, Dale Lawrence, Elizabeth A. Pillar-Little, Shane D. Ross, Michael Sama, David G. Schmale III, Travis J. Schuyler, Ajay Shankar, Suzanne W. Smith, Sean Waugh, Cory Dixon, Steve Borenstein, and Gijs de Boer


Discover IEEE Computer Society Publications

Unlock peer-reviewed research and expert commentary from the world’s trusted resource for computer science and engineering information.

Peer-Reviewed Magazines & Journals

We are the home to prestigious publications that deliver insights from the brightest minds in computing.

  • Full Range of Topics
  • High Impact Factors
  • Award-Winning Special Issues
  • Digital Library with 840,000 Articles

Publications by Topic

  • IEEE Open Journal of the Computer Society
  • IEEE Transactions on Computers
  • IEEE Intelligent Systems
  • IEEE Transactions on Pattern Analysis and Machine Intelligence
  • IEEE/ACM Transactions on Computational Biology and Bioinformatics
  • IEEE Transactions on Emerging Topics in Computing
  • IEEE Computer Graphics and Applications
  • IEEE MultiMedia
  • IEEE Transactions on Visualization and Computer Graphics
  • IEEE Computer Architecture Letters
  • IEEE Annals of the History of Computing
  • IEEE Transactions on Affective Computing
  • IT Professional
  • IEEE Internet Computing
  • IEEE Transactions on Big Data
  • IEEE Transactions on Cloud Computing
  • IEEE Transactions on Knowledge and Data Engineering
  • IEEE Transactions on Services Computing
  • IEEE Pervasive Computing
  • IEEE Transactions on Mobile Computing
  • IEEE Transactions on Parallel and Distributed Systems
  • Computing in Science & Engineering
  • IEEE Security & Privacy
  • IEEE Transactions on Dependable and Secure Computing
  • IEEE Transactions on Privacy
  • IEEE Software
  • IEEE Transactions on Software Engineering
  • IEEE Transactions on Sustainable Computing


Impact Factors

Impact factor (IF) measures how often a publication’s articles are cited and indicates its influence within a scientific community. IFs are reported by Clarivate Analytics Journal Citation Reports.
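As a rough sketch of the standard two-year calculation (the year labels below are only illustrative), a journal’s 2022 impact factor is

\[ \mathrm{IF}_{2022} = \frac{\text{citations received in 2022 by items the journal published in 2020 and 2021}}{\text{number of citable items the journal published in 2020 and 2021}} \]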

  • IEEE Transactions on Pattern Analysis and Machine Intelligence earned a 2022 IF of 23.6 —one of the highest of all artificial intelligence journals.
  • Eleven Computer Society journals hold the coveted top IF ranking in their specialty field.

Recent Awards

  • 2022 Mahoney Prize - IEEE Annals of the History of Computing , "Computing Capitalisms"
  • 2021 APEX Award for Publication Excellence - IEEE Security & Privacy , "The Future of Cybersecurity Policy"
  • Computer , "Technology Predictions"
  • IEEE Security & Privacy , "Smart Cities: Requirements for Security, Privacy, and Trust"
  • IEEE Software , "The Diversity Crisis in Software Development"


Computer Society Digital Library

All our magazines, journals, and conference proceedings can be found in the Computer Society Digital Library (CSDL) and the IEEE Xplore ® digital library.

Many universities and institutions already have a subscription. Contact your librarian for details.

Individuals can access the CSDL at a discounted rate with IEEE Computer Society Membership . All Student members receive full access to the CSDL at no extra cost. Professional members receive 18 article downloads and can add full CSDL access for one flat rate using promo code CSDLTRACK . Professional members also have the option of subscribing to one or more publications within the CSDL.

  • Individual Subscriptions
  • Institutional Subscriptions
"Being the largest and most comprehensive collection of computer science resources available, the Computer Society Digital Library is a beacon of hope for academic libraries...”

— Manayer Ali Ahmed Naseeb, Director, Ahlia University

  • View Calls for Papers
  • Read Author Guidelines
  • 8 Things Authors Should Know before Publishing
  • Common Writing and Publishing Mistakes
  • Publish Safely with Open Access Journals
  • IEEE DataPort (Free Subscription for Members)

Peer Review Volunteer Resources

  • Reviewer Resources
  • Editor Resources
  • Guest Editor Resources
  • Editor-in-Chief Resources


Open Access Research

Our first Gold Open Access (OA) journal, the IEEE Open Journal of the Computer Society (OJ-CS), and our second, IEEE Transactions on Privacy (TP), are dedicated to publishing high-impact articles on emerging topics and trends in all aspects of computing and privacy, respectively. Both publications provide a rapid review cycle for authors looking to publish their research and are fully compliant with funder mandates, including Plan S. OJ-CS and TP content is available for free in the IEEE Computer Society Digital Library (CSDL) and the IEEE Xplore® digital library.

All our publications offer authors the opportunity to publish OA. Learn about hybrid publications.

Thank You to Our Volunteers!


Our publications are led by computing professionals from around the world.

View Volunteering Opportunities

More News & Research

ComputingEdge Newsletter

Access insightful content from 12 magazines, all in one FREE monthly subscription available to both members and non-members.

Colloquium Abstracts

Explore a sampling of recently published abstracts from our journals, offered as a complimentary benefit for periodical subscribers.

Read expert commentary and analysis on today’s cutting-edge advances in computer technology in a freely available online format.

Find career guides, technology predictions, and high-level summaries of the latest developments and discoveries in computing.


Open research in computer science


From networks and communications to security and cryptology to big data, complexity, and analytics, SpringerOpen and BMC publish one of the leading open access portfolios in computer science. Learn about our journals and the research we publish on this page.

Highly-cited recent articles

Spotlight On


EPJ Data Science

See how EPJ Data Science brings attention to data science.


Reasons to publish in Human-centric Computing and Information Sciences

Download this handy infographic to see all the reasons why Human-centric Computing and Information Sciences is a great place to publish. 

We've asked a few of our authors about their experience of publishing with us.

What authors say about publishing in our journals:

  • "Fast, transparent, and fair." - EPJ Data Science
  • "Easy submission process through online portal." - Journal of Cloud Computing
  • "Patient support and constant reminder at every phase." - Journal of Cloud Computing
  • "Quick and relevant." - Journal of Big Data

How to Submit Your Manuscript


Computer science blog posts

Springer Open Blog

Read the latest from the SpringerOpen blog

The SpringerOpen blog highlights recent noteworthy research of general interest published in our open access journals. 



Review of Computer Engineering Studies


  • ISSN:  2369-0755 (print); 2369-0763 (online)
  • Indexing & Archiving: EBSCOhost, Cabell's Directory, Publons, ScienceOpen, Google Scholar, Index Copernicus, CrossRef, Portico, Microsoft Academic, CNKI Scholar, Baidu Scholar
  • Subject: Computer Sciences, Engineering


Review of Computer Engineering Studies (RCES) is an international, scholarly, peer-reviewed journal dedicated to providing scientists, engineers and technicians with the latest developments in computer science. The journal offers a window into the most recent discoveries in four categories, namely computing (computing theory, scientific computing, cloud computing, high-performance computing, numerical and symbolic computing), systems (database systems, real-time systems, operating systems, warning systems, decision support systems, information processes and systems, systems integration), intelligence (robotics, bio-informatics, web intelligence, artificial intelligence), and applications (security, networking, software engineering, pattern recognition, e-science and e-commerce, signal and image processing, control theory and applications). The editorial board welcomes original, substantiated papers on the above topics that could spark new scientific interest and benefit those devoted to computer science. The journal is published quarterly by the IIETA.

The RCES is an open access journal. All contents are available for free, that is, users are entitled to read, download, duplicate, distribute, print, search or link to the full texts of the articles in this journal without prior consent from the publisher or the author.

Focus and Scope

The RCES welcomes original research papers, technical notes and review articles on the following disciplines:

  • Computing theory
  • Scientific computing
  • Cloud computing
  • High-performance computing
  • Numerical and symbolic computing
  • Database systems
  • Real-time systems
  • Operating systems
  • Warning systems
  • Decision support systems
  • Information processes and systems
  • Systems integration
  • Bio-informatics
  • Web intelligence
  • Artificial intelligence
  • Software engineering
  • Pattern recognition
  • E-science and e-commerce
  • Signal and image processing
  • Control theory and applications

Publication Frequency

The RCES is published quarterly by the IIETA, with four regular issues (excluding special issues) and one volume per year.

Peer Review Statement

The IIETA adopts a double blind review process. Once submitted, a paper dealing with suitable topics will be sent to the editor-in-chief or managing editor, and then be reviewed by at least two experts in the relevant field. The reviewers are either members of our editorial board or special external experts invited by the journal. In light of the reviewers’ comments, the editor-in-chief or managing editor will make the final decision over the publication, and return the decision to the author.

There are four possible decisions concerning the paper: acceptance, minor revision, major revision and rejection. Acceptance means the paper will be published directly without any revision. Minor revision means the author should make minor changes to the manuscript according to the reviewers’ comments and submit the revised version to the IIETA. Major revision means the author should modify the manuscript significantly according to the reviewers’ comments and submit the revised version to the IIETA. In both cases, the revised version will be accepted or rejected at the discretion of the editor-in-chief or managing editor. Rejection means the submitted paper will not be published.

If a paper is accepted, the editor-in-chief or managing editor will send an acceptance letter to the author, and ask the author to prepare the paper in MS Word using the  template  of IIETA. 

Plagiarism Policy

Plagiarism is committed when one author uses another's work without permission, credit, or acknowledgment. Plagiarism takes different forms, from literal copying to paraphrasing the work of another. The IIETA uses CrossRef to screen for unoriginal material. Authors submitting to an IIETA journal should be aware that their paper may be submitted to CrossRef for screening at any point during the peer-review or production process. Any allegations of plagiarism made to a journal will be investigated by the editor-in-chief or managing editor. If the allegations appear to be founded, we will request that all named authors of the paper explain the overlapping material. If the explanation is not satisfactory, we will reject the submission, and may also reject future submissions.

For instructions on citing any of IIETA’s journals as well as our  Ethics Statement , see  Policies and Standards .

Indexing Information

  • Portico: https://www.portico.org/
  • Index Copernicus: http://journals.indexcopernicus.com
  • Google Scholar: http://scholar.google.com
  • CNKI Scholar: http://scholar.cnki.net

Included in

  • Crossref.org: https://www.crossref.org/


For Submission Inquiry

Email: [email protected]


  • Just Published
  • Featured Articles
  • Most Read and Cited
  • Winters’ Multiplicative Model Based Analysis of the Development and Prospects of New Energy Electric Vehicles in China , Zhishuo Jin, Yubing Qian
  • Agent-based Analysis and Simulation of Online Shopping Behavior in the Context of Online Promotion , Xiaoyi Deng
  • Study of Two Kinds of Analysis Methods of Intrusion Tolerance System State Transition Model , Zhiyong Luo, Xu Yang, Guanglu Sun, Zhiqiang Xie
  • A Review on Automated Billing for Smart Shopping System Using IOT , Priyanka S. Sahare, Anup Gade, Jayant Rohankar
  • Algorithm Research on the Analysis of College Student Score , Jinxin Ma, Limin Cui


California State University, San Bernardino


Computer Science and Engineering Theses, Projects, and Dissertations

Theses/Projects/Dissertations from 2024

A SMART HYBRID ENHANCED RECOMMENDATION AND PERSONALIZATION ALGORITHM USING MACHINE LEARNING , Aswin Kumar Nalluri

Theses/Projects/Dissertations from 2023

CLASSIFICATION OF LARGE SCALE FISH DATASET BY DEEP NEURAL NETWORKS , Priyanka Adapa

GEOSPATIAL WILDFIRE RISK PREDICTION USING DEEP LEARNING , Abner Alberto Benavides

HUMAN SUSPICIOUS ACTIVITY DETECTION , Nilamben Bhuva

MAX FIT EVENT MANAGEMENT WITH SALESFORCE , AKSHAY DAGWAR

MELANOMA DETECTION BASED ON DEEP LEARNING NETWORKS , Sanjay Devaraneni

Heart Disease Prediction Using Binary Classification , Virendra Sunil Devare

CLASSIFICATION OF THORAX DISEASES FROM CHEST X-RAY IMAGES , Sharad Jayusukhbhai Dobariya

WEB BASED MANAGEMENT SYSTEM FOR HOUSING SOCIETY , Likhitha Reddy Eddala

Sales and Stock Management System , Rashmika Gaddam Ms

CONTACTLESS FOOD ORDERING SYSTEM , Rishivar Kumar Goli

RESTAURANT MANAGEMENT WEBSITE , Akhil Sai Gollapudi

DISEASE OF LUNG INFECTION DETECTION USING CNN MODEL -BAYESIAN OPTIMIZATION , poojitha gutha

DATA POISONING ATTACKS ON PHASOR MEASUREMENT UNIT DATA , Rutuja Sanjeev Haridas

CRIME MAPPING ANALYSIS USING WEB APPLICATION , Lavanya Krishnappa

A LONG-TERM FUNDS PREDICTOR BASED ON DEEP LEARNING , SHUIYI KUANG

LIVER SEGMENTATION AND LESION DETECTION IN MEDICAL IMAGES USING A DEEP LEARNING-BASED U-NET MODEL , Kaushik Mahida

PHASOR MEASUREMENT UNIT DATA VISUALIZATION , Nikhila Mandava

TWITTER POLICING , Hemanth Kumar Medisetty

TRANSACTION MANAGEMENT SYSTEM FOR A PUBLISHER , HASSAIN SHAREEF MOHAMMED JR

LOBANGU: AN OPTICAL CHARACTER RECOGNITION RECEIPT MANAGEMENT APP FOR HEALTH CENTER PHARMACIES IN THE D.R.CONGO AND SURROUNDING EASTERN AFRICAN COUNTRIES , Bénis Munganga

PREDICTIVE MODEL FOR CFPB CONSUMER COMPLAINTS , Vyshnavi Nalluri

REVIEW CLASSIFICATION USING NATURAL LANGUAGE PROCESSING AND DEEP LEARNING , Brian Nazareth

Brain Tumor Detection Using MRI Images , Mayur Patel

QUIZ WEB APPLICATION , Dipti Rathod

HYPOTHYROID DISEASE ANALYSIS BY USING MACHINE LEARNING , SANJANA SEELAM

Pillow Based Sleep Tracking Device Using Raspberry Pi , Venkatachalam Seviappan

FINSERV ANDROID APPLICATION , Harsh Piyushkumar Shah

AUTOMATED MEDICAL NOTES LABELLING AND CLASSIFICATION USING MACHINE LEARNING , Akhil Prabhakar Thota

GENETIC PROGRAMMING TO OPTIMIZE PERFORMANCE OF MACHINE LEARNING ALGORITHMS ON UNBALANCED DATA SET , Asitha Thumpati

GOVERNMENT AID PORTAL , Darshan Togadiya

GENERAL POPULATION PROJECTION MODEL WITH CENSUS POPULATION DATA , Takenori Tsuruga

LUNG LESION SEGMENTATION USING DEEP LEARNING APPROACHES , Sree Snigdha Tummala

DETECTION OF PHISHING WEBSITES USING MACHINE LEARNING , Saranya Valleri

Machine Learning for Kalman Filter Tuning Prediction in GPS/INS Trajectory Estimation , Peter Wright

Theses/Projects/Dissertations from 2022

LEARN PROGRAMMING IN VIRTUAL REALITY? A PROJECT FOR COMPUTER SCIENCE STUDENTS , Benjamin Alexander

LUNG CANCER TYPE CLASSIFICATION , Mohit Ramajibhai Ankoliya

HIGH-RISK PREDICTION FOR COVID-19 PATIENTS USING MACHINE LEARNING , Raja Kajuluri

IMPROVING INDIA’S TRAFFIC MANAGEMENT USING INTELLIGENT TRANSPORTATION SYSTEMS , Umesh Makhloga

DETECTION OF EPILEPSY USING MACHINE LEARNING , Balamurugan Murugesan

SOCIAL MOBILE APPLICATION: UDROP , Mahmoud Oraiqat

Improved Sensor-Based Human Activity Recognition Via Hybrid Convolutional and Recurrent Neural Networks , Sonia Perez-Gamboa

College of Education FileMaker Extraction and End-User Database Development , Andrew Tran

DEEP LEARNING EDGE DETECTION IN IMAGE INPAINTING , Zheng Zheng

Theses/Projects/Dissertations from 2021

A General Conversational Chatbot , Vipin Nambiar

Verification System , Paras Nigam

DESKTOP APPLICATION FOR THE PUZZLE BOARD GAME “RUSH HOUR” , Huanqing Nong

Ahmedabad City App , Rushabh Picha

COMPUTER SURVEILLANCE SYSTEM USING WI-FI FOR ANDROID , Shashank Reddy Saireddy

ANDROID PARKING SYSTEM , Vishesh Reddy Sripati

Sentiment Analysis: Stock Index Prediction with Multi-task Learning and Word Polarity Over Time , Yue Zhou

Theses/Projects/Dissertations from 2020

BUBBLE-IN DIGITAL TESTING SYSTEM , Chaz Hampton

FEEDBACK REVIEW SYSTEM USING SENTIMENT ANALYSIS , Vineeth Kukkamalla

WEB APPLICATION FOR MOVIE PERFORMANCE PREDICTION , Devalkumar Patel

Theses/Projects/Dissertations from 2019

REVIEWS TO RATING CONVERSION AND ANALYSIS USING MACHINE LEARNING TECHNIQUES , Charitha Chanamolu

EASY EXAM , SARTHAK DABHI

EXTRACT TRANSFORM AND LOADING TOOL FOR EMAIL , Amit Rajiv Lawanghare

VEHICLE INFORMATION SYSTEM USING BLOCKCHAIN , Amey Zulkanthiwar

Theses/Projects/Dissertations from 2018

USING AUTOENCODER TO REDUCE THE LENGTH OF THE AUTISM DIAGNOSTIC OBSERVATION SCHEDULE (ADOS) , Sara Hussain Daghustani

California State University, San Bernardino Chatbot , Krutarth Desai

ORGANIZE EVENTS MOBILE APPLICATION , Thakshak Mani Chandra Reddy Gudimetla

SOCIAL NETWORK FOR SOFTWARE DEVELOPERS , Sanket Prabhakar Jadhav

VIRTUALIZED CLOUD PLATFORM MANAGEMENT USING A COMBINED NEURAL NETWORK AND WAVELET TRANSFORM STRATEGY , Chunyu Liu

INTER PROCESS COMMUNICATION BETWEEN TWO SERVERS USING MPICH , Nagabhavana Narla

SENSOR-BASED HUMAN ACTIVITY RECOGNITION USING BIDIRECTIONAL LSTM FOR CLOSELY RELATED ACTIVITIES , Arumugam Thendramil Pavai

NEURAL NETWORK ON VIRTUALIZATION SYSTEM, AS A WAY TO MANAGE FAILURE EVENTS OCCURRENCE ON CLOUD COMPUTING , Khoi Minh Pham

EPICCONFIGURATOR COMPUTER CONFIGURATOR AND CMS PLATFORM , IVO A. TANTAMANGO

STUDY ON THE PATTERN RECOGNITION ENHANCEMENT FOR MATRIX FACTORIZATIONS WITH AUTOMATIC RELEVANCE DETERMINATION , hau tao

Theses/Projects/Dissertations from 2017

CHILDREN’S SOCIAL NETWORK: KIDS CLUB , Eiman Alrashoud

MULTI-WAY COMMUNICATION SYSTEM , S. Chinnam

WEB APPLICATION FOR GRADUATE COURSE RECOMMENDATION SYSTEM , Sayali Dhumal

MOBILE APPLICATION FOR ATTENDANCE SYSTEM COYOTE-ATTENDANCE , Sindhu Hari

WEB APPLICATION FOR GRADUATE COURSE ADVISING SYSTEM , Sanjay Karrolla

Custom T-Shirt Designs , Ranjan Khadka

STUDENT CLASS WAITING LIST ENROLLMENT , AISHWARYA LACHAGARI

ANDROID MOBILE APPLICATION FOR HOSPITAL EXECUTIVES , Vihitha Nalagatla

PIPPIN MACHINE , Kiran Reddy Pamulaparthy

SOUND MODE APPLICATION , Sindhuja Pogaku

I2MAPREDUCE: DATA MINING FOR BIG DATA , Vishnu Vardhan Reddy Sherikar

COMPARING AND IMPROVING FACIAL RECOGNITION METHOD , Brandon Luis Sierra

NATURAL LANGUAGE PROCESSING BASED GENERATOR OF TESTING INSTRUMENTS , Qianqian Wang

AUTOMATIC GENERATION OF WEB APPLICATIONS AND MANAGEMENT SYSTEM , Yu Zhou

Theses/Projects/Dissertations from 2016

CLOTH - MODELING, DEFORMATION, AND SIMULATION , Thanh Ho

CoyoteLab - Linux Containers for Educational Use , Michael D. Korcha

PACKET FILTER APPROACH TO DETECT DENIAL OF SERVICE ATTACKS , Essa Yahya M Muharish

DATA MINING: TRACKING SUSPICIOUS LOGGING ACTIVITY USING HADOOP , Bir Apaar Singh Sodhi

Theses/Projects/Dissertations from 2015

APPLY DATA CLUSTERING TO GENE EXPRESSION DATA , Abdullah Jameel Abualhamayl Mr.

Density Based Data Clustering , Rayan Albarakati

Developing Java Programs on Android Mobile Phones Using Speech Recognition , Santhrushna Gande

THE DESIGN AND IMPLEMENTATION OF AN ADAPTIVE CHESS GAME , Mehdi Peiravi

CALIFORNIA STATE UNIVERSITY SAN BERNARDINO WiN GPS , Francisco A. Ron

ESTIMATION ON GIBBS ENTROPY FOR AN ENSEMBLE , Lekhya Sai Sake

A WEB-BASED TEMPERATURE MONITORING SYSTEM FOR THE COLLEGE OF ARTS AND LETTERS , Rigoberto Solorio

ANTICS: A CROSS-PLATFORM MOBILE GAME , Gerren D. Willis

Theses/Projects/Dissertations from 2014

Introducing Non-Determinism to the Parallel C Compiler , Rowen Concepcion

THE I: A CLIENT-BASED POINT-AND-CLICK PUZZLE GAME , Aldo Lewis

Interactive Student Planner Application , NII TETTEH TACKIE YARBOI

ANDROID MOBILE APPLICATION FOR CREST COMMUNITY CHURCH IN RIVERSIDE , Ran Wei

Proton Computed Tomography: Matrix Data Generation Through General Purpose Graphics Processing Unit Reconstruction , micah witt


Princeton University


Suggested Undergraduate Research Topics


How to Contact Faculty for IW/Thesis Advising

Send the professor an e-mail. When you write a professor, be clear that you want a meeting regarding a senior thesis or one-on-one IW project, and briefly describe the topic or idea that you want to work on. Check the faculty listing for email addresses.

Parastoo Abtahi, Room 419

Available for single-semester IW and senior thesis advising, 2024-2025

  • Research Areas: Human-Computer Interaction (HCI), Augmented Reality (AR), and Spatial Computing
  • Input techniques for on-the-go interaction (e.g., eye-gaze, microgestures, voice) with a focus on uncertainty, disambiguation, and privacy.
  • Minimal and timely multisensory output (e.g., spatial audio, haptics) that enables users to attend to their physical environment and the people around them, instead of a 2D screen.
  • Interaction with intelligent systems (e.g., IoT, robots) situated in physical spaces with a focus on updating users’ mental model despite the complexity and dynamicity of these systems.

Ryan Adams, Room 411

Research areas:

  • Machine learning driven design
  • Generative models for structured discrete objects
  • Approximate inference in probabilistic models
  • Accelerating solutions to partial differential equations
  • Innovative uses of automatic differentiation
  • Modeling and optimizing 3D printing and CNC machining

Andrew Appel, Room 209

Available for Fall 2024 IW advising, only

  • Research Areas: Formal methods, programming languages, compilers, computer security.
  • Software verification (for which taking COS 326 / COS 510 is helpful preparation)
  • Game theory of poker or other games (for which COS 217 / 226 are helpful)
  • Computer game-playing programs (for which COS 217 / 226)
  •  Risk-limiting audits of elections (for which ORF 245 or other knowledge of probability is useful)

Sanjeev Arora, Room 407

  • Theoretical machine learning, deep learning and its analysis, natural language processing. My advisees would typically have taken a course in algorithms (COS423 or COS 521 or equivalent) and a course in machine learning.
  • Show that finding approximate solutions to NP-complete problems is also NP-complete (i.e., come up with NP-completeness reductions a la COS 487). 
  • Experimental Algorithms: Implementing and Evaluating Algorithms using existing software packages. 
  • Studying/designing provable algorithms for machine learning and implementations using packages like scipy and MATLAB, including applications in natural language processing and deep learning.
  • Any topic in theoretical computer science.

David August, Room 221

Not available for IW or thesis advising, 2024-2025

  • Research Areas: Computer Architecture, Compilers, Parallelism
  • Containment-based approaches to security:  We have designed and tested a simple hardware+software containment mechanism that stops incorrect communication resulting from faults, bugs, or exploits from leaving the system.   Let's explore ways to use containment to solve real problems.  Expect to work with corporate security and technology decision-makers.
  • Parallelism: Studies show much more parallelism than is currently realized in compilers and architectures.  Let's find ways to realize this parallelism.
  • Any other interesting topic in computer architecture or compilers. 

Mark Braverman, 194 Nassau St., Room 231

  • Research Areas: computational complexity, algorithms, applied probability, computability over the real numbers, game theory and mechanism design, information theory.
  • Topics in computational and communication complexity.
  • Applications of information theory in complexity theory.
  • Algorithms for problems under real-life assumptions.
  • Game theory, network effects
  • Mechanism design (could be on a problem proposed by the student)

Sebastian Caldas, 221 Nassau Street, Room 105

  • Research Areas: collaborative learning, machine learning for healthcare. Typically, I will work with students that have taken COS324.
  • Methods for collaborative and continual learning.
  • Machine learning for healthcare applications.

Bernard Chazelle, 194 Nassau St., Room 301

  • Research Areas: Natural Algorithms, Computational Geometry, Sublinear Algorithms. 
  • Natural algorithms (flocking, swarming, social networks, etc).
  • Sublinear algorithms
  • Self-improving algorithms
  • Markov data structures

Danqi Chen, Room 412

  • My advisees would be expected to have taken a course in machine learning and ideally have taken COS484 or an NLP graduate seminar.
  • Representation learning for text and knowledge bases
  • Pre-training and transfer learning
  • Question answering and reading comprehension
  • Information extraction
  • Text summarization
  • Any other interesting topics related to natural language understanding/generation

Marcel Dall'Agnol, Corwin 034

  • Research Areas: Theoretical computer science. (Specifically, quantum computation, sublinear algorithms, complexity theory, interactive proofs and cryptography)
  • Research Areas: Machine learning

Jia Deng, Room 423

  •  Research Areas: Computer Vision, Machine Learning.
  • Object recognition and action recognition
  • Deep Learning, autoML, meta-learning
  • Geometric reasoning, logical reasoning

Adji Bousso Dieng, Room 406

  • Research areas: Vertaix is a research lab at Princeton University led by Professor Adji Bousso Dieng. We work at the intersection of artificial intelligence (AI) and the natural sciences. The models and algorithms we develop are motivated by problems in those domains and contribute to advancing methodological research in AI. We leverage tools in statistical machine learning and deep learning in developing methods for learning with the data, of various modalities, arising from the natural sciences.

Robert Dondero, Corwin Hall, Room 038

  • Research Areas:  Software engineering; software engineering education.
  • Develop or evaluate tools to facilitate student learning in undergraduate computer science courses at Princeton, and beyond.
  • In particular, can code critiquing tools help students learn about software quality?

Zeev Dvir, 194 Nassau St., Room 250

  • Research Areas: computational complexity, pseudo-randomness, coding theory and discrete mathematics.
  • Independent Research: I have various research problems related to Pseudorandomness, Coding theory, Complexity and Discrete mathematics - all of which require strong mathematical background. A project could also be based on writing a survey paper describing results from a few theory papers revolving around some particular subject.

Benjamin Eysenbach, Room 416

  • Research areas: reinforcement learning, machine learning. My advisees would typically have taken COS324.
  • Using RL algorithms to applications in science and engineering.
  • Emergent behavior of RL algorithms on high-fidelity robotic simulators.
  • Studying how architectures and representations can facilitate generalization.

Christiane Fellbaum, 1-S-14 Green

  • Research Areas: theoretical and computational linguistics, word sense disambiguation, lexical resource construction, English and multilingual WordNet(s), ontology
  • Anything having to do with natural language--come and see me with/for ideas suitable to your background and interests. Some topics students have worked on in the past:
  • Developing parsers, part-of-speech taggers, morphological analyzers for underrepresented languages (you don't have to know the language to develop such tools!)
  • Quantitative approaches to theoretical linguistics questions
  • Extensions and interfaces for WordNet (English and WN in other languages),
  • Applications of WordNet(s), including:
  • Foreign language tutoring systems,
  • Spelling correction software,
  • Word-finding/suggestion software for ordinary users and people with memory problems,
  • Machine Translation 
  • Sentiment and Opinion detection
  • Automatic reasoning and inferencing
  • Collaboration with professors in the social sciences and humanities ("Digital Humanities")

Adam Finkelstein, Room 424 

  • Research Areas: computer graphics, audio.

Robert S. Fish, Corwin Hall, Room 037

  • Networking and telecommunications
  • Learning, perception, and intelligence, artificial and otherwise;
  • Human-computer interaction and computer-supported cooperative work
  • Online education, especially in Computer Science Education
  • Topics in research and development innovation methodologies including standards, open-source, and entrepreneurship
  • Distributed autonomous organizations and related blockchain technologies

Michael Freedman, Room 308 

  • Research Areas: Distributed systems, security, networking
  • Projects related to streaming data analysis, datacenter systems and networks, untrusted cloud storage and applications. Please see my group website at http://sns.cs.princeton.edu/ for current research projects.

Ruth Fong, Room 032

  • Research Areas: computer vision, machine learning, deep learning, interpretability, explainable AI, fairness and bias in AI
  • Develop a technique for understanding AI models
  • Design an AI model that is interpretable by design
  • Build a paradigm for detecting and/or correcting failure points in an AI model
  • Analyze an existing AI model and/or dataset to better understand its failure points
  • Build a computer vision system for another domain (e.g., medical imaging, satellite data, etc.)
  • Develop a software package for explainable AI
  • Adapt explainable AI research to a consumer-facing problem

Note: I am happy to advise any project if there's a sufficient overlap in interest and/or expertise; please reach out via email to chat about project ideas.

Tom Griffiths, Room 405

Available for Fall 2024 single-semester IW advising, only

Research areas: computational cognitive science, computational social science, machine learning and artificial intelligence

Note: I am open to projects that apply ideas from computer science to understanding aspects of human cognition in a wide range of areas, from decision-making to cultural evolution and everything in between. For example, we have current projects analyzing chess game data and magic tricks, both of which give us clues about how human minds work. Students who have expertise or access to data related to games, magic, strategic sports like fencing, or other quantifiable domains of human behavior feel free to get in touch.

Aarti Gupta, Room 220

  • Research Areas: Formal methods, program analysis, logic decision procedures
  • Finding bugs in open source software using automatic verification tools
  • Software verification (program analysis, model checking, test generation)
  • Decision procedures for logical reasoning (SAT solvers, SMT solvers)

Elad Hazan, Room 409  

  • Research interests: machine learning methods and algorithms, efficient methods for mathematical optimization, regret minimization in games, reinforcement learning, control theory and practice
  • Machine learning, efficient methods for mathematical optimization, statistical and computational learning theory, regret minimization in games.
  • Implementation and algorithm engineering for control, reinforcement learning and robotics
  • Implementation and algorithm engineering for time series prediction

Felix Heide, Room 410

  • Research Areas: Computational Imaging, Computer Vision, Machine Learning (focus on Optimization and Approximate Inference).
  • Optical Neural Networks
  • Hardware-in-the-loop Holography
  • Zero-shot and Simulation-only Learning
  • Object recognition in extreme conditions
  • 3D Scene Representations for View Generation and Inverse Problems
  • Long-range Imaging in Scattering Media
  • Hardware-in-the-loop Illumination and Sensor Optimization
  • Inverse Lidar Design
  • Phase Retrieval Algorithms
  • Proximal Algorithms for Learning and Inference
  • Domain-Specific Language for Optics Design

Peter Henderson , 302 Sherrerd Hall

  • Research Areas: Machine learning, law, and policy

Kyle Jamieson, Room 306

  • Research areas: Wireless and mobile networking; indoor radar and indoor localization; Internet of Things
  • See other topics on my independent work  ideas page  (campus IP and CS dept. login req'd)

Alan Kaplan, 221 Nassau Street, Room 105

Research Areas:

  • Random apps of kindness - mobile application/technology frameworks used to help individuals or communities; topic areas include, but are not limited to: first response, accessibility, environment, sustainability, social activism, civic computing, tele-health, remote learning, crowdsourcing, etc.
  • Tools automating programming language interoperability - Java/C++, React Native/Java, etc.
  • Software visualization tools for education
  • Connected consumer devices, applications and protocols

Brian Kernighan, Room 311

  • Research Areas: application-specific languages, document preparation, user interfaces, software tools, programming methodology
  • Application-oriented languages, scripting languages.
  • Tools; user interfaces
  • Digital humanities

Zachary Kincaid, Room 219

  • Research areas: programming languages, program analysis, program verification, automated reasoning
  • Independent Research Topics:
  • Develop a practical algorithm for an intractable problem (e.g., by developing practical search heuristics, or by reducing to, or by identifying a tractable sub-problem, ...).
  • Design a domain-specific programming language, or prototype a new feature for an existing language.
  • Any interesting project related to programming languages or logic.

Gillat Kol, Room 316

Aleksandra Korolova, 309 Sherrerd Hall

  • Research areas: Societal impacts of algorithms and AI; privacy; fair and privacy-preserving machine learning; algorithm auditing.

Advisees typically have taken one or more of COS 226, COS 324, COS 423, COS 424 or COS 445.

Pravesh Kothari, Room 320

  • Research areas: Theory

Amit Levy, Room 307

  • Research Areas: Operating Systems, Distributed Systems, Embedded Systems, Internet of Things
  • Distributed hardware testing infrastructure
  • Second factor security tokens
  • Low-power wireless network protocol implementation
  • USB device driver implementation

Kai Li, Room 321

  • Research Areas: Distributed systems; storage systems; content-based search and data analysis of large datasets.
  • Fast communication mechanisms for heterogeneous clusters.
  • Approximate nearest-neighbor search for high dimensional data.
  • Data analysis and prediction of in-patient medical data.
  • Optimized implementation of classification algorithms on manycore processors.

Xiaoyan Li, 221 Nassau Street, Room 104

  • Research areas: Information retrieval, novelty detection, question answering, AI, machine learning and data analysis.
  • Explore new statistical retrieval models for document retrieval and question answering.
  • Apply AI in various fields.
  • Apply supervised or unsupervised learning in health, education, finance, and social networks, etc.
  • Any interesting project related to AI, machine learning, and data analysis.

Lydia Liu, Room 414

  • Research Areas: algorithmic decision making, machine learning and society
  • Theoretical foundations for algorithmic decision making (e.g. mathematical modeling of data-driven decision processes, societal level dynamics)
  • Societal impacts of algorithms and AI through a socio-technical lens (e.g. normative implications of worst case ML metrics, prediction and model arbitrariness)
  • Machine learning for social impact domains, especially education (e.g. responsible development and use of LLMs for education equity and access)
  • Evaluation of human-AI decision making using statistical methods (e.g. causal inference of long term impact)

Wyatt Lloyd, Room 323

  • Research areas: Distributed Systems
  • Caching algorithms and implementations
  • Storage systems
  • Distributed transaction algorithms and implementations

Alex Lombardi , Room 312

  • Research Areas: Theory

Margaret Martonosi, Room 208

  • Quantum Computing research, particularly related to architecture and compiler issues for QC.
  • Computer architectures specialized for modern workloads (e.g., graph analytics, machine learning algorithms, mobile applications
  • Investigating security and privacy vulnerabilities in computer systems, particularly IoT devices.
  • Other topics in computer architecture or mobile / IoT systems also possible.

Jonathan Mayer, Sherrerd Hall, Room 307 

Available for Spring 2025 single-semester IW, only

  • Research areas: Technology law and policy, with emphasis on national security, criminal procedure, consumer privacy, network management, and online speech.
  • Assessing the effects of government policies, both in the public and private sectors.
  • Collecting new data that relates to government decision making, including surveying current business practices and studying user behavior.
  • Developing new tools to improve government processes and offer policy alternatives.

Mae Milano, Room 307

  • Local-first / peer-to-peer systems
  • Wide-area storage systems
  • Consistency and protocol design
  • Type-safe concurrency
  • Language design
  • Gradual typing
  • Domain-specific languages
  • Languages for distributed systems

Andrés Monroy-Hernández, Room 405

  • Research Areas: Human-Computer Interaction, Social Computing, Public-Interest Technology, Augmented Reality, Urban Computing
  • Research interests: developing public-interest socio-technical systems.  We are currently creating alternatives to gig work platforms that are more equitable for all stakeholders. For instance, we are investigating the socio-technical affordances necessary to support a co-op food delivery network owned and managed by workers and restaurants. We are exploring novel system designs that support self-governance, decentralized/federated models, community-centered data ownership, and portable reputation systems.  We have opportunities for students interested in human-centered computing, UI/UX design, full-stack software development, and qualitative/quantitative user research.
  • Beyond our core projects, we are open to working on research projects that explore the use of emerging technologies, such as AR, wearables, NFTs, and DAOs, for creative and out-of-the-box applications.

Christopher Moretti, Corwin Hall, Room 036

  • Research areas: Distributed systems, high-throughput computing, computer science/engineering education
  • Expansion, improvement, and evaluation of open-source distributed computing software.
  • Applications of distributed computing for "big science" (e.g. biometrics, data mining, bioinformatics)
  • Software and best practices for computer science education and study, especially Princeton's 126/217/226 sequence or MOOCs development
  • Sports analytics and/or crowd-sourced computing

Radhika Nagpal, F316 Engineering Quadrangle

  • Research areas: control, robotics and dynamical systems

Karthik Narasimhan, Room 422

  • Research areas: Natural Language Processing, Reinforcement Learning
  • Autonomous agents for text-based games ( https://www.microsoft.com/en-us/research/project/textworld/ )
  • Transfer learning/generalization in NLP
  • Techniques for generating natural language
  • Model-based reinforcement learning

Arvind Narayanan, 308 Sherrerd Hall 

Research Areas: fair machine learning (and AI ethics more broadly), the social impact of algorithmic systems, tech policy

Pedro Paredes, Corwin Hall, Room 041

My primary research work is in Theoretical Computer Science.

  • Research Interests: Spectral Graph Theory, Pseudorandomness, Complexity Theory, Coding Theory, Quantum Information Theory, Combinatorics.

The IW projects I am interested in advising can be divided into three categories:

 1. Theoretical research

I am open to advise work on research projects in any topic in one of my research areas of interest. A project could also be based on writing a survey given results from a few papers. Students should have a solid background in math (e.g., elementary combinatorics, graph theory, discrete probability, basic algebra/calculus) and theoretical computer science (226 and 240 material, like big-O/Omega/Theta, basic complexity theory, basic fundamental algorithms). Mathematical maturity is a must.

A (non-exhaustive) list of topics of projects I'm interested in:

  • Explicit constructions of better vertex expanders and/or unique neighbor expanders.
  • Constructions of deterministic or random high-dimensional expanders.
  • Pseudorandom generators for different problems.
  • Topics around the quantum PCP conjecture.
  • Topics around quantum error correcting codes and locally testable codes, including constructions, encoding and decoding algorithms.

 2. Theory-informed practical implementations of algorithms

Very often the great advances in theoretical research are either not tested in practice or not feasible to implement in practice. Thus, I am interested in any project that tries to make theoretical ideas applicable in practice. This includes coming up with new algorithms that trade some theoretical guarantees for a feasible implementation while trying to retain the soul of the original idea; implementing new algorithms in a suitable programming language; and empirically testing practical implementations and comparing them with benchmarks or theoretical expectations. A project in this area doesn't have to be in my main areas of research; any theoretical result could be suitable for such a project.

Some examples of areas of interest:

  • Streaming algorithms
  • Numerical linear algebra
  • Property testing
  • Parallel / distributed algorithms
  • Online algorithms

 3. Machine learning with a theoretical foundation

I am interested in projects in machine learning that have some mathematical/theoretical component, even if most of the project is applied. This includes topics like mathematical optimization, statistical learning, fairness and privacy.

One particular area I have recently been interested in is rating systems (e.g., chess Elo) and applications of these to experts problems.
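For concreteness, here is a minimal sketch, in Python, of the standard Elo update rule that such rating systems build on; the function names and the K-factor of 32 are illustrative choices, not part of any specific project.

def expected_score(rating_a, rating_b):
    # Expected score of player A against player B under the Elo model.
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400.0))

def elo_update(rating_a, rating_b, score_a, k=32.0):
    # score_a is 1.0 if A wins, 0.5 for a draw, 0.0 if A loses.
    exp_a = expected_score(rating_a, rating_b)
    new_a = rating_a + k * (score_a - exp_a)
    new_b = rating_b + k * ((1.0 - score_a) - (1.0 - exp_a))
    return new_a, new_b

# Example: a 1500-rated player beats a 1700-rated player; the ratings move toward each other.
print(elo_update(1500.0, 1700.0, 1.0))

The same update, with ratings attached to forecasters or algorithms rather than chess players, is one way such rating systems connect to the classic experts problem.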

Final Note: I am also willing to advise any project with any mathematical/theoretical component, even if it's not the main one; please reach out via email to chat about project ideas.

Iasonas Petras, Corwin Hall, Room 033

  • Research Areas: Information Based Complexity, Numerical Analysis, Quantum Computation.
  • Prerequisites: Reasonable mathematical maturity. In case of a project related to Quantum Computation a certain familiarity with quantum mechanics is required (related courses: ELE 396/PHY 208).
  • Possible research topics include:

1.   Quantum algorithms and circuits:

  • i. Design or simulate quantum circuits implementing quantum algorithms.
  • ii. Design of quantum algorithms solving/approximating continuous problems (such as Eigenvalue problems for Partial Differential Equations).

2.   Information Based Complexity:

  • i. Necessary and sufficient conditions for tractability of Linear and Linear Tensor Product Problems in various settings (for example worst case or average case). 
  • ii. Necessary and sufficient conditions for tractability of Linear and Linear Tensor Product Problems under new tractability and error criteria.
  • iii. Necessary and sufficient conditions for tractability of Weighted problems.
  • iv. Necessary and sufficient conditions for tractability of Weighted Problems under new tractability and error criteria.

3. Topics in Scientific Computation:

  • i. Randomness, Pseudorandomness, MC and QMC methods and their applications (Finance, etc)

Yuri Pritykin, 245 Carl Icahn Lab

  • Research interests: Computational biology; Cancer immunology; Regulation of gene expression; Functional genomics; Single-cell technologies.
  • Potential research projects: Development, implementation, assessment and/or application of algorithms for analysis, integration, interpretation and visualization of multi-dimensional data in molecular biology, particularly single-cell and spatial genomics data.

Benjamin Raphael, Room 309  

  • Research interests: Computational biology and bioinformatics; Cancer genomics; Algorithms and machine learning approaches for analysis of large-scale datasets
  • Implementation and application of algorithms to infer evolutionary processes in cancer
  • Identifying correlations between combinations of genomic mutations in human and cancer genomes
  • Design and implementation of algorithms for genome sequencing from new DNA sequencing technologies
  • Graph clustering and network anomaly detection, particularly using diffusion processes and methods from spectral graph theory

Vikram Ramaswamy, 035 Corwin Hall

  • Research areas: Interpretability of AI systems, Fairness in AI systems, Computer vision.
  • Constructing a new method to explain a model / create an interpretable-by-design model
  • Analyzing a current model / dataset to understand bias within the model/dataset
  • Proposing new fairness evaluations
  • Proposing new methods to train to improve fairness
  • Developing synthetic datasets for fairness / interpretability benchmarks
  • Understanding robustness of models

Ran Raz, Room 240

  • Research Area: Computational Complexity
  • Independent Research Topics: Computational Complexity, Information Theory, Quantum Computation, Theoretical Computer Science

Szymon Rusinkiewicz, Room 406

  • Research Areas: computer graphics; computer vision; 3D scanning; 3D printing; robotics; documentation and visualization of cultural heritage artifacts
  • Research ways of incorporating rotation invariance into computer vision tasks such as feature matching and classification
  • Investigate approaches to robust 3D scan matching
  • Model and compensate for imperfections in 3D printing
  • Given a collection of small mobile robots, apply control policies learned in simulation to the real robots.

Olga Russakovsky, Room 408

  • Research Areas: computer vision, machine learning, deep learning, crowdsourcing, fairness & bias in AI
  • Design a semantic segmentation deep learning model that can operate in a zero-shot setting (i.e., recognize and segment objects not seen during training)
  • Develop a deep learning classifier that is impervious to protected attributes (such as gender or race) that may be erroneously correlated with target classes
  • Build a computer vision system for the novel task of inferring what object (or part of an object) a human is referring to when pointing to a single pixel in the image. This includes both collecting an appropriate dataset using crowdsourcing on Amazon Mechanical Turk, creating a new deep learning formulation for this task, and running extensive analysis of both the data and the model

Sebastian Seung, Princeton Neuroscience Institute, Room 153

  • Research Areas: computational neuroscience, connectomics, "deep learning" neural networks, social computing, crowdsourcing, citizen science
  • Gamification of neuroscience (EyeWire  2.0)
  • Semantic segmentation and object detection in brain images from microscopy
  • Computational analysis of brain structure and function
  • Neural network theories of brain function

Jaswinder Pal Singh, Room 324

  • Research Areas: Boundary of technology and business/applications; building and scaling technology companies with special focus at that boundary; parallel computing systems and applications: parallel and distributed applications and their implications for software and architectural design; system software and programming environments for multiprocessors.
  • Develop a startup company idea, and build a plan/prototype for it.
  • Explore tradeoffs at the boundary of technology/product and business/applications in a chosen area.
  • Study and develop methods to infer insights from data in different application areas, from science to search to finance to others. 
  • Design and implement a parallel application. Possible areas include graphics, compression, biology, among many others. Analyze performance bottlenecks using existing tools, and compare programming models/languages.
  • Design and implement a scalable distributed algorithm.

Mona Singh, Room 420

  • Research Areas: computational molecular biology, as well as its interface with machine learning and algorithms.
  • Whole and cross-genome methods for predicting protein function and protein-protein interactions.
  • Analysis and prediction of biological networks.
  • Computational methods for inferring specific aspects of protein structure from protein sequence data.
  • Any other interesting project in computational molecular biology.

Robert Tarjan, 194 Nassau St., Room 308

  • Research Areas: Data structures; graph algorithms; combinatorial optimization; computational complexity; computational geometry; parallel algorithms.
  • Implement one or more data structures or combinatorial algorithms to provide insight into their empirical behavior.
  • Design and/or analyze various data structures and combinatorial algorithms.

Olga Troyanskaya, Room 320

  • Research Areas: Bioinformatics; analysis of large-scale biological data sets (genomics, gene expression, proteomics, biological networks); algorithms for integration of data from multiple data sources; visualization of biological data; machine learning methods in bioinformatics.
  • Implement and evaluate one or more gene expression analysis algorithms.
  • Develop algorithms for assessment of performance of genomic analysis methods.
  • Develop, implement, and evaluate visualization tools for heterogeneous biological data.

David Walker, Room 211

  • Research Areas: Programming languages, type systems, compilers, domain-specific languages, software-defined networking and security
  • Independent Research Topics:  Any other interesting project that involves humanitarian hacking, functional programming, domain-specific programming languages, type systems, compilers, software-defined networking, fault tolerance, language-based security, theorem proving, logic or logical frameworks.

Shengyi Wang, Postdoctoral Research Associate, Room 216

Available for Fall 2024 single-semester IW only.

  • Independent Research topics: Explore Escher-style tilings using (introductory) group theory and automata theory to produce beautiful pictures.

Kevin Wayne, Corwin Hall, Room 040

  • Research Areas: design, analysis, and implementation of algorithms; data structures; combinatorial optimization; graphs and networks.
  • Design and implement computer visualizations of algorithms or data structures.
  • Develop pedagogical tools or programming assignments for the computer science curriculum at Princeton and beyond.
  • Develop assessment infrastructure and assessments for MOOCs.

Matt Weinberg, 194 Nassau St., Room 222

  • Research Areas: algorithms, algorithmic game theory, mechanism design, game theoretical problems in {Bitcoin, networking, healthcare}.
  • Theoretical questions related to COS 445 topics such as matching theory, voting theory, auction design, etc. 
  • Theoretical questions related to incentives in applications like Bitcoin, the Internet, health care, etc. In a little bit more detail: protocols for these systems are often designed assuming that users will follow them. But often, users will actually be strictly happier to deviate from the intended protocol. How should we reason about user behavior in these protocols? How should we design protocols in these settings?

Huacheng Yu, Room 310

  • Research Areas: data structures; streaming algorithms.
  • Design and analyze data structures / streaming algorithms.
  • Prove impossibility results (lower bounds).
  • Implement and evaluate data structures / streaming algorithms.

Ellen Zhong, Room 314

Opportunities outside the department.

We encourage students to look into doing interdisciplinary computer science research and to work with professors in departments other than computer science. However, every CS independent work project must have a strong computer science element (even if it has other scientific or artistic elements as well). To do a project with an adviser outside of computer science you must have permission of the department. This can be accomplished by having a second co-adviser within the computer science department or by contacting the independent work supervisor about the project and having him or her sign the independent work proposal form.

Here is a list of professors outside the computer science department who are eager to work with computer science undergraduates.

Maria Apostolaki, Engineering Quadrangle, C330

  • Research areas: Computing & Networking, Data & Information Science, Security & Privacy

Branko Glisic, Engineering Quadrangle, Room E330

  • Documentation of historic structures
  • Cyber physical systems for structural health monitoring
  • Developing virtual and augmented reality applications for documenting structures
  • Applying machine learning techniques to generate 3D models from 2D plans of buildings
  • Contact: Rebecca Napolitano, rkn2 (@princeton.edu)

Mihir Kshirsagar, Sherrerd Hall, Room 315

Center for Information Technology Policy.

  • Consumer protection
  • Content regulation
  • Competition law
  • Economic development
  • Surveillance and discrimination

Sharad Malik, Engineering Quadrangle, Room B224


  • Design of reliable hardware systems
  • Verifying complex software and hardware systems

Prateek Mittal, Engineering Quadrangle, Room B236

  • Internet security and privacy 
  • Social Networks
  • Privacy technologies, anonymous communication
  • Network Science
  • Internet security and privacy: The insecurity of Internet protocols and services threatens the safety of our critical network infrastructure and billions of end users. How can we defend end users as well as our critical network infrastructure from attacks?
  • Trustworthy social systems: Online social networks (OSNs) such as Facebook, Google+, and Twitter have revolutionized the way our society communicates. How can we leverage social connections between users to design the next generation of communication systems?
  • Privacy Technologies: Privacy on the Internet is eroding rapidly, with businesses and governments mining sensitive user information. How can we protect the privacy of our online communications? The Tor project (https://www.torproject.org/) is a potential application of interest.

Ken Norman,  Psychology Dept, PNI 137

  • Research Areas: Memory, the brain and computation 
  • Lab:  Princeton Computational Memory Lab

Potential research topics

  • Methods for decoding cognitive state information from neuroimaging data (fMRI and EEG) 
  • Neural network simulations of learning and memory

Caroline Savage

Office of Sustainability, Phone: (609) 258-7513, Email: cs35 (@princeton.edu)

The  Campus as Lab  program supports students using the Princeton campus as a living laboratory to solve sustainability challenges. The Office of Sustainability has created a list of campus as lab research questions, filterable by discipline and topic, on its  website .

An example from Computer Science could include using  TigerEnergy , a platform which provides real-time data on campus energy generation and consumption, to study one of the many energy systems or buildings on campus. Three CS students used TigerEnergy to create a  live energy heatmap of campus .

Other potential projects include:

  • Apply game theory to sustainability challenges
  • Develop a tool to help visualize interactions between complex campus systems, e.g. energy and water use, transportation and storm water runoff, purchasing and waste, etc.
  • How can we learn (in aggregate) about individuals’ waste, energy, transportation, and other behaviors without impinging on privacy?

Janet Vertesi, Sociology Dept, Wallace Hall, Room 122

  • Research areas: Sociology of technology; Human-computer interaction; Ubiquitous computing.
  • Possible projects: At the intersection of computer science and social science, my students have built mixed reality games, produced artistic and interactive installations, and studied mixed human-robot teams, among other projects.

David Wentzlaff, Engineering Quadrangle, Room 228

Computing, Operating Systems, Sustainable Computing.

  • Instrument Princeton's Green (HPCRC) data center
  • Investigate power utilization on a processor core implemented in an FPGA
  • Dismantle and document all of the components in modern electronics. Invent new ways to build computers that can be recycled easier.
  • Other topics in parallel computer architecture or operating systems


Computer Engineering: Articles & Journals

  • Articles & Journals
  • Books & Ebooks
  • Standards & Patents
  • Citing & Writing Help
  • Using AI in Research

Journal Title Abbreviations

  • All that JAS (Journal Abbreviations Sources)
  • Web of Science Journal Title Abbreviations list shows the abbreviations used for journal titles as cited works

Find Articles in Research Databases

  • Recommended databases
  • Consider these, also
  • OneSearch  
  • Google Scholar
  • What is peer review?

How-to instructions

Depending on your topic, these databases also may be relevant.

Selected full text articles from journals in a wide range of subjects, plus magazines, reports, books, and more

Selected full text articles from journals, magazines, conference proceedings, and more

Database free on the web

Selected full text articles from journals, trade, and popular magazines covering scientific and technical fields

Advanced Search

Sample searches: "critical thinking" AND "higher education"; (race OR racial) AND (discriminat* OR prejudic*). Select limiters (articles from scholarly publications, etc.) on the results screen.

While it doesn't offer some of the sophisticated options available in research databases, Google Scholar can be helpful. See instructions for customizing Google Scholar to provide Full Text @ UHCL links in results.


A peer-reviewed (or refereed ) journal:

  • uses  experts from the same subject field or profession as the author to evaluate a manuscript prior to acceptance for publication
  • has articles that report on research studies or provide scholarly analysis of topics
  • may include book reviews, editorials, or other brief items that are not considered scholarly articles
  • Anatomy of a Scholarly Article: Explains key elements from the first and last pages of a typical scholarly or academic article (North Carolina State Univ. Libraries).

Peer Review in 3 Minutes

(3:15) Explains the academic publishing process for research articles and scholarly journals, including the quality control process of peer review (North Carolina State Univ. Libraries).

Get Full Text for an Article

If a database lacks immediate full text, click Find It @ UHCL  in results to check for full text from another source. Follow Available Online links in OneSearch to a resource with full text for UH Clear Lake users as shown below.

[Screenshot: Find It @ UHCL leading to full text in EBSCOhost PsycARTICLES]

If full text is not found, submit an article request .

  • Request a book, article, or other item ILLiad Interlibrary Loan logon


Other Sources

Links to reliable websites and research guides.

  • TechRepublic: Research Library Includes webcasts, white papers, file downloads and more with information on various aspects of technology. Also has blogs and forums.
  • CompInfo The Computer Information Center One-stop reference resource for corporate IT, computer software, computers, and communication.
  • The Virtual Library: Electrical and Electronics Engineering The WWW Virtual Library (VL) is the oldest catalogue of the Web, started by Tim Berners-Lee, the creator of HTML and of the Web itself. It is run by a loose confederation of volunteers, who compile pages of key links for areas in which they are expert. The VL pages are widely recognised as being amongst the highest-quality guides to particular sections of the Web.
  • WWW Computer Architecture Page University of Wisconsin page with links to tools, conferences, people and organizations in Computer Architecture.

Research Areas

Research areas represent the major research activities in the Department of Computer Science. Faculty and students have developed new ideas to achieve results in all aspects of the nine areas of research.

Choose a research area below to learn more:

  • Artificial Intelligence and Machine Learning
  • Human-Computer Interaction and Information Visualization
  • Computer Engineering (in collaboration with the Electrical and Computer Engineering Department)

Research in Electrical and Computer Engineering covers an extremely broad range of topics. Whether in computer architecture, energy and power systems or in nanotechnology devices, the research conducted in ECE is at the cutting edge of technological and scientific developments. 


  • Computer Engineering

Computer engineering concerns itself with the understanding and design of hardware needed to carry out computation, as well as the hardware-software interface. It is sometimes said that computer engineering is the nexus that connects electrical engineering and computer science. Research and teaching areas with a significant computer engineering component include digital logic and VLSI design, computer architecture and organization, embedded systems and Internet of things, virtualization and operating systems, code generation and optimization, computer networks and data centers, electronic design automation, and robotics.

Related Research Areas

  • Artificial Intelligence
  • Complex Systems, Network Science and Computation
  • Computer Architecture
  • Computer Systems
  • Data Mining
  • Energy and the Environment
  • Rapid Prototyping
  • Robotics and Autonomy
  • Scientific Computing
  • Sensors and Actuators
  • Signal and Image Processing
  • Statistics and Machine Learning


Robotics at Cornell spans various subareas, including perception, control, learning, planning, and human-robot interaction. We work with a variety of robots such as aerial robots, home and office assistant robots, autonomous cars, humanoids, evolutionary robots, legged robots, snake robots and more. The Collective Embodied Intelligence Lab works on the design and coordination of large robot collectives able to achieve complex behaviors beyond the reach of single-robot systems, along with corresponding studies of how social insects do so in nature. Major research topics include swarm intelligence, embodied intelligence, autonomous construction, bio-cyber physical systems, human-swarm interaction, and soft robots.

Visit the Cornell Engineering Robotics Website for more.

  • Integrated Circuits
  • Power Electronics
  • Robotics and Autonomy
  • Systems and Networking


  • Information, Networks, and Decision Systems

This research area focuses on the advancement of research and education in the information, learning, network, and decision sciences. Our research is at the frontier of a wide range of fields and applications, including machine learning and signal processing, optimization and control theory, information theory and coding, power systems and electricity markets, network science, and game theory. The work encompasses theory and practice, with the overarching objective of developing the mathematical underpinnings and tools needed to address some of the most pressing challenges facing society today in energy and climate change, transportation, social networks, and human health. In particular, the Foundations of Information, Networks, and Decision Systems (FIND) group comprises a vibrant community of faculty, postdocs, and students dedicated to addressing these challenges in a principled and theory-guided manner.

  • Biotechnology
  • Computational Science and Engineering
  • Energy Systems
  • Image Analysis
  • Information Theory and Communications
  • Optimization
  • Remote Sensing


  • Physical Electronics, Devices, and Plasma Science

Work in this area applies the physics of electromagnetism, quantum mechanics, and the solid state to implement devices and systems for applications including energy, quantum technologies, sensing, communication, and computation. Our efforts span theory and development of new electronic and optical devices and materials, micro-electromechanical systems, acoustic and optical sensing and imaging, quantum control of individual atoms near absolute zero temperature, and experiments on high-energy plasmas at temperatures close to those at the center of the sun.

At Cornell ECE, we work on diverse topics aimed at transforming the way we view the world. Our interdisciplinary research reveals fundamental similarities across problems and prompts new research into some of the most exciting and cutting-edge developments in the field.

  • Advanced Materials Processing
  • Astrophysics, Fusion and Plasma Physics
  • High Energy Density, Plasma Physics and Electromagnetics
  • Materials Synthesis and Processing
  • Microfluidics and Microsystems
  • Nanotechnology
  • Photonics and Optoelectronics
  • Semiconductor Physics and Devices
  • Solid State, Electronics, Optoelectronics and MEMs


  • Circuits and Electronic Systems

Integrated circuits are ubiquitous and integral to everyday devices, from cellular phones and home appliances to automobiles and satellites. Healthcare, communications, consumer electronics, high-performance scientific computing, and many other fields are creating tremendous new opportunities for innovation in circuits and electronic systems at every level. Research in this area spans topics including analog and mixed signal circuits, RF transceivers, low power interfaces, power electronics and wireless power transfer, and many others. 

  • Micro Nano Systems
  • Optical Physics and Quantum Information Science


  • Bio-Electrical Engineering

Biological and Biomedical Electrical Engineering (B2E2) consists of both applied and fundamental work to understand the complexity of biological systems at different scales, e.g., from a single neuronal or cancer cell, all the way to the brain or malignant tumor. B2E2 aims to develop new hardware and computational tools to identify, characterize, and treat diseases. In the physical domain, electrical engineering approaches to integrated microsystems lead to new biological and medical sensors. These sensors consist of state-of-the-art ultrasonic, RF, optical, MRI, CT, and electrical impedance transducers.

The integration of sensors and electronics is used to develop implantable and wearable devices with decreasing size, weight, and power and increasing functionality. B2E2 microsystems can help create interfaces for sensing and actuation to help understand the physiological and pathological mechanisms of diseases, and enable advanced robotic interfaces in medicine. Medical devices can generate vast amounts of data, which require both real-time and post-acquisition processing. B2E2 faculty, sometimes in collaboration with medical researchers, develop advanced computational tools to learn from and exploit data and apply artificial intelligence approaches to impact medical practice by improving early disease detection, disease diagnosis, assessment of response to therapy, and guided surgical procedures.

  • Biomedical Imaging and Instrumentation
  • Complex Systems, Network Science and Technology
  • Computer-Aided Diagnosis
  • Nanobio Applications
  • Neuroscience


Hardware That Protects Against Software Attacks

ECE's Ed Suh and Zhiru Zhang and CS's Andrew C. Myers aim to develop both hardware architecture and design tools to provide comprehensive and provable security assurance for future computing systems against software-level attacks that exploit seven common vulnerability classes.



Re-architecting Next-Gen Computing Systems

Disaggregated architectures have the potential to increase resource capacity by 10 to 100 times compared with server-centric architectures.


Re-imagining Computer System Memories

Interdisciplinary team will provide new insights and an entirely new paradigm for the semiconductor industry in the emerging era of big data.

The Martinez and Zhang Research Groups

Engineers to hack 50-year-old computing problem with new center

Cornell engineers are part of a national effort to reinvent computing by developing new solutions to the “von Neumann bottleneck,” a feature-turned-problem that is almost as old as the modern computer itself.


The Laboratory of Plasma Studies: Uncovering mysteries of high energy density plasma physics

In the basement of Grumman Hall, an x-ray pulse produced by a hot, dense plasma – an ionized gas – lasting only fractions of a microsecond both begins and ends an experiment. Hidden within that fraction of time lies a piece of a puzzle—data that graduate students and staff scientists at the Laboratory of Plasma Studies (LPS) will use to better understand the mysterious physics behind inertial confinement fusion.


Sophia Rocco: Hoping to make the world a better place through a potential renewable energy source

When she was looking at graduate schools, physics major Sophia Rocco thought she would be in a materials science program bridging her interests in electricity and magnetism and novel materials for solar cells. Chancing upon the School of Electrical and Computer Engineering at Cornell, she discovered the Laboratory of Plasma Studies (LPS).


Finding the Ultimate Energy Source: Cornell’s Lab of Plasma Studies

Plasma is one of the four fundamental states of matter, but it does not exist freely on the Earth’s surface. It must be artificially generated by heating or subjecting a neutral gas to a strong electromagnetic field. Located in the basement of Grumman Hall are two large pulse-power generators that create plasma by delivering extremely high currents to ordinary matter for short periods. These generators are part of the  Lab of Plasma Studies  at Cornell University.


A schematic, left, of a gallium oxide vertical power field-effect transistor, and a scanning electron microscope image, right, of the transistor, showing a 330-nanometer-wide by 795-nanometer-long channel.

Vertical gallium oxide transistor high in power, efficiency

The research group led by Grace Xing and Debdeep Jena presented research on a new gallium oxide field-effect transistor at a conference at the Massachusetts Institute of Technology May 29-June 1.


Molnar, Jena and Xing join national consortium to develop future cellular infrastructure

Three Cornell faculty will be part of the newly established $27.5 million ComSenTer, a center for converged terahertz communications and sensing.


Data on the Brain

The NSF has found a willing partner at Cornell University in this quest to create technologies that will allow researchers to image the brain and the nervous system.

Computer Technology Research Paper Topics


This list of computer technology research paper topics provides 33 potential topics for research papers, along with an overview article on the history of computer technology.

1. Analog Computers

Paralleling the split between analog and digital computers, in the 1950s the term analog computer was a posteriori projected onto pre-existing classes of mechanical, electrical, and electromechanical computing artifacts, subsuming them under the same category. The concept of analog, like the technical demarcation between analog and digital computer, was absent from the vocabulary of those classifying artifacts for the 1914 Edinburgh Exhibition, the first world’s fair emphasizing computing technology, and this leaves us with an invaluable index of the impressive number of classes of computing artifacts amassed during the few centuries of capitalist modernity. True, from the debate between ‘‘smooth’’ and ‘‘lumpy’’ artificial lines of computing (1910s) to the differentiation between ‘‘continuous’’ and ‘‘cyclic’’ computers (1940s), the subsequent analog–digital split became possible by the multitudinous accumulation of attempts to decontextualize the computer from its socio-historical use alternately to define the ideal computer technically. The fact is, however, that influential classifications of computing technology from the previous decades never provided an encompassing demarcation compared to the analog– digital distinction used since the 1950s. Historians of the digital computer find that the experience of working with software was much closer to art than science, a process that was resistant to mass production; historians of the analog computer find this to have been typical of working with the analog computer throughout all its aspects. The historiography of the progress of digital computing invites us to turn to the software crisis, which perhaps not accidentally, surfaced when the crisis caused by the analog ended. Noticeably, it was not until the process of computing with a digital electronic computer became sufficiently visual by the addition of a special interface—to substitute for the loss of visualization that was previously provided by the analog computer—that the analog computer finally disappeared.

2. Artificial Intelligence

Artificial intelligence (AI) is the field of software engineering that builds computer systems and occasionally robots to perform tasks that require intelligence. The term ‘‘artificial intelligence’’ was coined by John McCarthy in his 1955 proposal for a two-month summer workshop held at Dartmouth in 1956. This workshop marks the official birth of AI and brought together young researchers who would nurture the field as it grew over the next several decades: Marvin Minsky, Claude Shannon, Arthur Samuel, Ray Solomonoff, Oliver Selfridge, Allen Newell, and Herbert Simon. It would be difficult to argue that the technologies derived from AI research had a profound effect on our way of life by the beginning of the 21st century. However, AI technologies have been successfully applied in many industrial settings, medicine and health care, and video games. Programming techniques developed in AI research were incorporated into more widespread programming practices, such as high-level programming languages and time-sharing operating systems. While AI did not succeed in constructing a computer which displays the general mental capabilities of a typical human, such as the HAL computer in Arthur C. Clarke and Stanley Kubrick’s film 2001: A Space Odyssey, it has produced programs that perform some apparently intelligent tasks, often at a much greater level of skill and reliability than humans. More than this, AI has provided a powerful and defining image of what computer technology might someday be capable of achieving.

3. Computer and Video Games

Interactive computer and video games were first developed in laboratories as the late-night amusements of computer programmers or independent projects of television engineers. Their formats include computer software; networked, multiplayer games on time-shared systems or servers; arcade consoles; home consoles connected to television sets; and handheld game machines. The first experimental projects grew out of early work in computer graphics, artificial intelligence, television technology, hardware and software interface development, computer-aided education, and microelectronics. Important examples were Willy Higinbotham’s oscilloscope-based ‘‘Tennis for Two’’ at the Brookhaven National Laboratory (1958); ‘‘Spacewar!,’’ by Steve Russell, Alan Kotok, J. Martin Graetz and others at the Massachusetts Institute of Technology (1962); Ralph Baer’s television-based tennis game for Sanders Associates (1966); several networked games from the PLATO (Programmed Logic for Automatic Teaching Operations) Project at the University of Illinois during the early 1970s; and ‘‘Adventure,’’ by Will Crowther of Bolt, Beranek & Newman (1972), extended by Don Woods at Stanford University’s Artificial Intelligence Laboratory (1976). The main lines of development during the 1970s and early 1980s were home video consoles, coin-operated arcade games, and computer software.

4. Computer Displays

The display is an essential part of any general-purpose computer. Its function is to act as an output device to communicate data to humans using the highest bandwidth input system that humans possess—the eyes. Much of the development of computer displays has been about trying to get closer to the limits of human visual perception in terms of color and spatial resolution. Mainframe and minicomputers used ‘‘terminals’’ to display the output. These were fed data from the host computer and processed the data to create screen images using a graphics processor. The display was typically integrated with a keyboard system and some communication hardware as a terminal or video display unit (VDU) following the basic model used for teletypes. Personal computers (PCs) in the late 1970s and early 1980s changed this model by integrating the graphics controller into the computer chassis itself. Early PC displays typically displayed only monochrome text and communicated in character codes such as ASCII. Line-scanning frequencies were typically from 15 to 20 kilohertz—similar to television. CRT displays rapidly developed after the introduction of video graphics array (VGA) technology (640 by 480 pixels in 16 colors) in the mid-1980s and scan frequencies rose to 60 kilohertz or more for mainstream displays; 100 kilohertz or more for high-end displays. These displays were capable of displaying formats up to 2048 by 1536 pixels with high color depths. Because the human eye is very quick to respond to visual stimulation, developments in display technology have tended to track the development of semiconductor technology that allows the rapid manipulation of the stored image.
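To make the connection between display formats and scan frequencies concrete, here is a rough back-of-the-envelope calculation (my own illustration, assuming a 60 Hz refresh rate and ignoring blanking intervals, so real figures run somewhat higher):

    # Approximate line (horizontal scan) rate and pixel rate for two formats,
    # assuming a 60 Hz refresh and ignoring blanking intervals for simplicity.
    formats = {"VGA": (640, 480), "high-end CRT": (2048, 1536)}
    refresh_hz = 60

    for name, (width, height) in formats.items():
        line_rate_khz = height * refresh_hz / 1e3       # horizontal scan frequency
        pixel_rate_mhz = width * height * refresh_hz / 1e6
        print(f"{name}: ~{line_rate_khz:.0f} kHz line rate, ~{pixel_rate_mhz:.0f} MHz pixel rate")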

5. Computer Memory for Personal Computers

During the second half of the twentieth century, the two primary methods used for the long-term storage of digital information were magnetic and optical recording. These methods were selected primarily on the basis of cost. Compared to core or transistorized random-access memory (RAM), storage costs for magnetic and optical media were several orders of magnitude cheaper per bit of information and were not volatile; that is, the information did not vanish when electrical power was turned off. However, access to information stored on magnetic and optical recorders was much slower compared to RAM memory. As a result, computer designers used a mix of both types of memory to accomplish computational tasks. Designers of magnetic and optical storage systems have sought meanwhile to increase the speed of access to stored information to increase the overall performance of computer systems, since most digital information is stored magnetically or optically for reasons of cost.
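The cost/speed trade-off behind mixing fast RAM with slower magnetic or optical storage can be illustrated with a small effective-access-time calculation; the latencies and hit ratio below are made-up, order-of-magnitude placeholders rather than figures from the text:

    # Illustrative cost/speed trade-off behind mixing RAM with magnetic storage.
    # All numbers below are made-up, order-of-magnitude placeholders.
    ram_latency_s = 100e-9       # ~100 ns random access
    disk_latency_s = 10e-3       # ~10 ms seek plus rotation
    hit_ratio = 0.95             # fraction of accesses served from RAM

    effective_latency = hit_ratio * ram_latency_s + (1 - hit_ratio) * disk_latency_s
    print(f"Effective access time: {effective_latency * 1e6:.1f} microseconds")
    print(f"Slowdown vs. pure RAM: {effective_latency / ram_latency_s:.0f}x")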

6. Computer Modeling

Computer simulation models have transformed the natural, engineering, and social sciences, becoming crucial tools for disciplines as diverse as ecology, epidemiology, economics, urban planning, aerospace engineering, meteorology, and military operations. Computer models help researchers study systems of extreme complexity, predict the behavior of natural phenomena, and examine the effects of human interventions in natural processes. Engineers use models to design everything from jets and nuclear-waste repositories to diapers and golf clubs. Models enable astrophysicists to simulate supernovas, biochemists to replicate protein folding, geologists to predict volcanic eruptions, and physiologists to identify populations at risk of lead poisoning. Clearly, computer models provide a powerful means of solving problems, both theoretical and applied.
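As a minimal illustration of what a computer simulation model looks like in code, here is a sketch of a discrete-time SIR epidemic model integrated with Euler steps; the parameters are purely illustrative:

    # Minimal sketch of a computer simulation model: a discrete-time SIR
    # epidemic model integrated with Euler steps (illustrative parameters only).
    def simulate_sir(s, i, r, beta=0.3, gamma=0.1, dt=1.0, days=160):
        n = s + i + r
        history = []
        for _ in range(int(days / dt)):
            new_infections = beta * s * i / n * dt
            new_recoveries = gamma * i * dt
            s -= new_infections
            i += new_infections - new_recoveries
            r += new_recoveries
            history.append((s, i, r))
        return history

    trajectory = simulate_sir(s=9999, i=1, r=0)
    peak_infected = max(i for _, i, _ in trajectory)
    print(f"Peak simultaneous infections: {peak_infected:.0f}")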

7. Computer Networks

Computers and computer networks have changed the way we do almost everything—the way we teach, learn, do research, access or share information, communicate with each other, and even the way we entertain ourselves. A computer network, in simple terms, consists of two or more computing devices (often called nodes) interconnected by means of some medium capable of transmitting data that allows the computers to communicate with each other in order to provide a variety of services to users.

8. Computer Science

Computer science occupies a unique position among the scientific and technical disciplines. It revolves around a specific artifact—the electronic digital computer—that touches upon a broad and diverse set of fields in its design, operation, and application. As a result, computer science represents a synthesis and extension of many different areas of mathematics, science, engineering, and business.

9. Computer-Aided Control Technology

The story of computer-aided control technology is inextricably entwined with the modern history of automation. Automation in the first half of the twentieth century involved (often analog) processes for continuous automatic measurement and control of hardware by hydraulic, mechanical, or electromechanical means. These processes facilitated the development and refinement of battlefield fire-control systems, feedback amplifiers for use in telephony, electrical grid simulators, numerically controlled milling machines, and dozens of other innovations.

10. Computer-Aided Design and Manufacture

Computer-aided design and manufacture, known by the acronym CAD/CAM, is a process for manufacturing mechanical components, wherein computers are used to link the information needed in and produced by the design process to the information needed to control the machine tools that produce the parts. However, CAD/CAM actually constitutes two separate technologies that developed along similar, but unrelated, lines until they were combined in the 1970s.

11. Computer-User Interface

A computer interface is the point of contact between a person and an electronic computer. Today’s interfaces include a keyboard, mouse, and display screen. Computer user interfaces developed through three distinct stages, which can be identified as batch processing, interactive computing, and the graphical user interface (GUI). Today’s graphical interfaces support additional multimedia features, such as streaming audio and video. In GUI design, every new software feature introduces more icons into the process of computer–user interaction. Presently, the large vocabulary of icons used in GUI design is difficult for users to remember, which creates a complexity problem. As GUIs become more complex, interface designers are adding voice recognition and intelligent agent technologies to make computer user interfaces even easier to operate.

12. Early Computer Memory

Mechanisms to store information were present in early mechanical calculating machines, going back to Charles Babbage’s analytical engine proposed in the 1830s. It introduced the concept of the ‘‘store’’ and, had it ever been built, would have held 1000 numbers of up to 50 decimal digits. However, the move toward base-2 or binary computing in the 1930s brought about a new paradigm in technology—the digital computer, whose most elementary component was an on–off switch. Information on a digital system is represented using a combination of on and off signals, stored as binary digits (shortened to bits): zeros and ones. Text characters, symbols, or numerical values can all be coded as bits, so that information stored in digital memory is just zeros and ones, regardless of the storage medium. The history of computer memory is closely linked to the history of computers, but a distinction should be made between primary (or main) and secondary memory. Computers need only operate on one segment of data at a time, and with memory being a scarce resource, the rest of the data set could be stored in less expensive and more abundant secondary memory.
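A tiny example makes the point that text and numbers alike reduce to bits; the snippet below (Python, my own illustration) shows the same two bytes read as characters, as binary digits, and as an integer:

    # Everything stored in digital memory is ultimately bits: the same
    # byte values can be read back as text or as a number.
    text = "Hi"
    raw = text.encode("ascii")                 # b'Hi' -> bytes 0x48, 0x69
    bits = " ".join(f"{byte:08b}" for byte in raw)
    print("As bits:   ", bits)                 # 01001000 01101001
    print("As integer:", int.from_bytes(raw, "big"))   # 18537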

13. Early Digital Computers

Digital computers were a marked departure from the electrical and mechanical calculating and computing machines in wide use from the early twentieth century. The innovation was of information being represented using only two states (on or off), which came to be known as ‘‘digital.’’ Binary (base 2) arithmetic and logic provided the tools for these machines to perform useful functions. George Boole’s binary system of algebra allowed any mathematical equation to be represented by simply true or false logic statements. By using only two states, engineering was also greatly simplified, and universality and accuracy increased. Further developments, from the early purpose-built machines to programmable ones, accompanied by many key technological advances, resulted in the well-known success and proliferation of the digital computer.

14. Electronic Control Technology

The advancement of electrical engineering in the twentieth century made a fundamental change in control technology. New electronic devices including vacuum tubes (valves) and transistors were used to replace electromechanical elements in conventional controllers and to develop new types of controllers. In these practices, engineers discovered basic principles of control theory that could be further applied to design electronic control systems.

15. Encryption and Code Breaking

The word cryptography comes from the Greek words for ‘‘hidden’’ (kryptos) and ‘‘to write’’ (graphein)—literally, the science of ‘‘hidden writing.’’ In the twentieth century, cryptography became fundamental to information technology (IT) security generally. Before the invention of the digital computer at mid-century, national governments across the world relied on mechanical and electromechanical cryptanalytic devices to protect their own national secrets and communications, as well as to expose enemy secrets. Code breaking played an important role in both World Wars I and II, and the successful exploits of Polish and British cryptographers and signals intelligence experts in breaking the code of the German Enigma ciphering machine (which had approximately 150 million million million possible transformations between a message and its code) are well documented.
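To put the Enigma key space cited above in perspective, a short calculation (assuming a purely hypothetical brute-force rate of one billion trial decryptions per second) shows why exhaustive search was out of the question:

    # Rough sense of the Enigma key space cited above (~1.5 x 10**20 settings),
    # assuming a purely hypothetical brute-force search rate.
    key_space = 1.5e20
    keys_per_second = 1e9          # assumed: one billion trial decryptions per second
    seconds = key_space / keys_per_second
    years = seconds / (3600 * 24 * 365)
    print(f"Exhaustive search at {keys_per_second:.0e} keys/s: ~{years:,.0f} years")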

16. Error Checking and Correction

In telecommunications, whether transmission of data or voice signals is over copper, fiber-optic, or wireless links, information coded in the signal transmitted must be decoded by the receiver from a background of noise. Signal errors can be introduced, for example from physical defects in the transmission medium (semiconductor crystal defects, dust or scratches on magnetic memory, bubbles in optical fibers), from electromagnetic interference (natural or manmade) or cosmic rays, or from cross-talk (unwanted coupling) between channels. In digital signal transmission, data is transmitted as ‘‘bits’’ (ones or zeros, corresponding to on or off in electronic circuits). Random bit errors occur singly and in no relation to each other. Burst error is a large, sustained error or loss of data, perhaps caused by transmission problems in the connecting cables, or sudden noise. Analog to digital conversion can also introduce sampling errors.
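A minimal sketch of how redundant bits allow errors to be detected and corrected: the classic Hamming(7,4) code below protects four data bits with three parity bits and can locate and fix any single flipped bit (this is a standard textbook code, shown here only as an illustration):

    # Minimal sketch of error correction: Hamming(7,4) encodes 4 data bits with
    # 3 parity bits, detects a single flipped bit, and corrects it.
    def hamming74_encode(d1, d2, d3, d4):
        p1 = d1 ^ d2 ^ d4
        p2 = d1 ^ d3 ^ d4
        p3 = d2 ^ d3 ^ d4
        return [p1, p2, d1, p3, d2, d3, d4]     # codeword positions 1..7

    def hamming74_correct(code):
        c = list(code)
        s1 = c[0] ^ c[2] ^ c[4] ^ c[6]          # checks positions 1,3,5,7
        s2 = c[1] ^ c[2] ^ c[5] ^ c[6]          # checks positions 2,3,6,7
        s3 = c[3] ^ c[4] ^ c[5] ^ c[6]          # checks positions 4,5,6,7
        error_pos = s1 + 2 * s2 + 4 * s3        # 0 means no single-bit error
        if error_pos:
            c[error_pos - 1] ^= 1               # flip the erroneous bit back
        return c, error_pos

    codeword = hamming74_encode(1, 0, 1, 1)
    received = list(codeword)
    received[4] ^= 1                            # simulate a single-bit error
    fixed, pos = hamming74_correct(received)
    print("error at position", pos, "corrected:", fixed == codeword)   # 5, True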

17. Global Positioning System (GPS)

The NAVSTAR (NAVigation System Timing And Ranging) Global Positioning System (GPS) provides an unlimited number of military and civilian users worldwide with continuous, highly accurate data on their position in four dimensions—latitude, longitude, altitude, and time—through all weather conditions. It includes space, control, and user segments. A constellation of 24 satellites in nearly circular orbits at an altitude of 10,900 nautical miles—six orbital planes, equally spaced 60 degrees apart, inclined approximately 55 degrees relative to the equator, and each with four equidistant satellites—transmits microwave signals in two different L-band frequencies. From any point on earth, between five and eight satellites are ‘‘visible’’ to the user. Synchronized, extremely precise atomic clocks—rubidium and cesium—aboard the satellites render the constellation semiautonomous by alleviating the need to continuously control the satellites from the ground. The control segment consists of a master facility at Schriever Air Force Base, Colorado, and a global network of automated stations. It passively tracks the entire constellation and, via an S-band uplink, periodically sends updated orbital and clock data to each satellite to ensure that navigation signals received by users remain accurate. Finally, GPS users—on land, at sea, in the air or space—rely on commercially produced receivers to convert satellite signals into position, time, and velocity estimates.
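The positioning step can be illustrated with a deliberately simplified 2D analogue of what a GPS receiver does: given known transmitter positions and measured ranges, solve for the receiver's position (the sketch below is my own illustration; it ignores receiver clock bias, measurement noise, and the third dimension):

    # Simplified, illustrative 2D analogue of GPS positioning: given known
    # "satellite" positions and measured ranges, solve for the receiver position
    # by linearizing the range equations.
    import numpy as np

    anchors = np.array([[0.0, 20.0], [15.0, 18.0], [-12.0, 16.0]])
    true_pos = np.array([2.0, 3.0])
    ranges = np.linalg.norm(anchors - true_pos, axis=1)   # perfect measurements

    # Subtract the first range equation from the others to get a linear system.
    A = 2 * (anchors[1:] - anchors[0])
    b = (ranges[0] ** 2 - ranges[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
    estimate, *_ = np.linalg.lstsq(A, b, rcond=None)
    print("estimated position:", estimate)                # ~[2., 3.]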

18. Gyrocompass and Inertial Guidance

Before the twentieth century, navigation at sea employed two complementary methods, astronomical and dead reckoning. The former involved direct measurements of celestial phenomena to ascertain position, while the latter required continuous monitoring of a ship’s course, speed, and distance run. New navigational technology was required not only for iron ships in which traditional compasses required correction, but for aircraft and submarines in which magnetic compasses cannot be used. Owing to their rapid motion, aircraft presented challenges for near instantaneous navigation data collection and reduction. Electronics furnished the exploitation of radio and the adaptation of a gyroscope to direction finding through the invention of the nonmagnetic gyrocompass.

Although the Cold War arms race after World War II led to the development of inertial navigation, German manufacture of the V-2 rocket under the direction of Wernher von Braun during the war involved a proto-inertial system, a two-gimballed gyro with an integrator to determine speed. Inertial guidance combines a gyrocompass with accelerometers installed along orthogonal axes, devices that record all accelerations of the vehicle in which inertial guidance has been installed. With this system, if the initial position of the vehicle is known, then the vehicle’s position at any moment is known because integrators record all directions and accelerations and calculate speeds and distance run. Inertial guidance devices can subtract accelerations due to gravity or other motions of the vehicle. Because inertial guidance does not depend on an outside reference, it is the ultimate dead reckoning system, ideal for the nuclear submarines for which they were invented and for ballistic missiles. Their self-contained nature makes them resistant to electronic countermeasures. Inertial systems were first installed in commercial aircraft during the 1960s. The expense of manufacturing inertial guidance mechanisms (and their necessary management by computer) has limited their application largely to military and some commercial purposes. Inertial systems accumulate errors, so their use at sea (except for submarines) has been as an adjunct to other navigational methods, unlike aircraft applications. Only the development of the global positioning system (GPS) at the end of the century promised to render all previous navigational technologies obsolete. Nevertheless, a range of technologies, some dating to the beginning of the century, remain in use in a variety of commercial and leisure applications.
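Dead reckoning by integration can be sketched in a few lines: starting from a known position, accelerations are integrated once to get speed and again to get distance run (a 1D toy example with made-up readings and gravity already removed):

    # Toy dead-reckoning sketch: integrate accelerometer readings twice to track
    # position from a known starting point (1D, gravity already subtracted).
    dt = 0.1                                   # seconds between samples
    accelerations = [1.0] * 20 + [0.0] * 30 + [-1.0] * 20   # m/s^2 profile

    velocity, position = 0.0, 0.0
    for a in accelerations:
        velocity += a * dt                     # integrate acceleration -> speed
        position += velocity * dt              # integrate speed -> distance run
    print(f"final speed {velocity:.1f} m/s, distance travelled {position:.1f} m")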

19. Hybrid Computers

Following the emergence of the analog–digital demarcation in the late 1940s—and the ensuing battle between a speedy analog versus the accurate digital—the term ‘‘hybrid computer’’ surfaced in the early 1960s. The assumptions held by the adherents of the digital computer—regarding the dynamic mechanization of computational labor to accompany the equally dynamic increase in computational work—were becoming a universal ideology. From this perspective, the digital computer justly appeared to be technically superior. In introducing the digital computer to social realities, however, extensive interaction with the experienced analog computer adherents proved indispensable, especially given that the digital proponents’ expectation of progress by employing the available and inexpensive hardware was stymied by the lack of inexpensive software. From this perspective—as historiographically unwanted as it may be by those who agree with the essentialist conception of the analog–digital demarcation—the history of the hybrid computer suggests that the computer as we now know it was brought about by linking the analog and the digital, not by separating them. Placing the ideal analog and the ideal digital at the two poles, all computing techniques that combined some features of both fell under ‘‘hybrid computation’’; the designators ‘‘balanced’’ or ‘‘true’’ were preserved for those built with appreciable amounts of both. True hybrids fell into the middle spectrum that included: pure analog computers, analog computers using digital-type numerical analysis techniques, analog computers programmed with the aid of digital computers, analog computers using digital control and logic, analog computers using digital subunits, analog computers using digital computers as peripheral equipment, balanced hybrid computer systems, digital computers using analog subroutines, digital computers with analog arithmetic elements, digital computers designed to permit analog-type programming, digital computers with analog-oriented compilers and interpreters, and pure digital computers.

20. Information Theory

Information theory, also known originally as the mathematical theory of communication, was first explicitly formulated during the mid-twentieth century. Almost immediately it became a foundation; first, for the more systematic design and utilization of numerous telecommunication and information technologies; and second, for resolving a paradox in thermodynamics. Finally, information theory has contributed to new interpretations of a wide range of biological and cultural phenomena, from organic physiology and genetics to cognitive behavior, human language, economics, and political decision making. Reflecting the symbiosis between theory and practice typical of twentieth century technology, technical issues in early telegraphy and telephony gave rise to a proto-information theory developed by Harry Nyquist at Bell Labs in 1924 and Ralph Hartley, also at Bell Labs, in 1928. This theory in turn contributed to advances in telecommunications, which stimulated the development of information theory per se by Claude Shannon and Warren Weaver, in their book The Mathematical Theory of Communication published in 1949. As articulated by Claude Shannon, a Bell Labs researcher, the technical concept of information is defined by the probability of a specific message or signal being picked out from a number of possibilities and transmitted from A to B. Information in this sense is mathematically quantifiable. The amount of information, I, conveyed by signal, S, is inversely related to its probability, P. That is, the more improbable a message, the more information it contains. To facilitate the mathematical analysis of messages, the measure is conveniently defined as I = log2(1/P(S)), and is named a binary digit or ‘‘bit’’ for short. Thus in the simplest case of a two-state signal (1 or 0, corresponding to on or off in electronic circuits), with equal probability for each state, the transmission of either state as the code for a message would convey one bit of information. The theory of information opened up by this conceptual analysis has become the basis for constructing and analyzing digital computational devices and a whole range of information technologies (i.e., technologies including telecommunications and data processing), from telephones to computer networks.
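The definition I = log2(1/P(S)) is easy to check numerically; the short sketch below computes the information content of a few signals and the average information per symbol (Shannon entropy) of a small biased source:

    import math

    # Information content of a signal with probability p, in bits: I = log2(1/p).
    def information_bits(p):
        return math.log2(1 / p)

    print(information_bits(0.5))    # 1.0 bit: a fair coin flip
    print(information_bits(1 / 8))  # 3.0 bits: one outcome out of 8 equally likely

    # Average information per symbol (Shannon entropy) of a biased source.
    probabilities = [0.5, 0.25, 0.125, 0.125]
    entropy = sum(p * information_bits(p) for p in probabilities)
    print(f"entropy: {entropy} bits/symbol")   # 1.75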

21. Internet

The Internet is a global computer network of networks whose origins are found in U.S. military efforts. In response to Sputnik and the emerging space race, the Advanced Research Projects Agency (ARPA) was formed in 1958 as an agency of the Pentagon. The researchers at ARPA were given a generous mandate to develop innovative technologies such as communications.

In 1962, psychologist J.C.R. Licklider from the Massachusetts Institute of Technology’s Lincoln Laboratory joined ARPA to take charge of the Information Processing Techniques Office (IPTO). In 1963 Licklider wrote a memo proposing an interactive network allowing people to communicate via computer. This project did not materialize. In 1966, Bob Taylor, then head of the IPTO, noted that he needed three different computer terminals to connect to three different machines in different locations around the nation. Taylor also recognized that universities working with IPTO needed more computing resources. Instead of the government buying machines for each university, why not share machines? Taylor revitalized Licklider’s idea, securing $1 million in funding, and hired 29-year-old Larry Roberts to direct the creation of ARPAnet.

In 1974, Robert Kahn and Vinton Cerf proposed the first internet-working protocol, a way for datagrams (packets) to be communicated between disparate networks, and they called it an ‘‘internet.’’ Their efforts created transmission control protocol/internet protocol (TCP/IP). On January 1, 1983, TCP/IP replaced NCP on ARPAnet. Other networks adopted TCP/IP and it became the dominant standard for all networking by the late 1990s.

In 1981 the U.S. National Science Foundation (NSF) created the Computer Science Network (CSNET) to provide universities that did not have access to ARPAnet with their own network. In 1986, the NSF sponsored the NSFNET ‘‘backbone’’ to connect five supercomputing centers. The backbone also connected ARPAnet and CSNET together, and the idea of a network of networks became firmly entrenched. The open technical architecture of the Internet allowed numerous innovations to be grafted easily onto the whole. When ARPAnet was dismantled in 1990, the Internet was thriving at universities and technology-oriented companies. The NSF backbone was dismantled in 1995 when the NSF realized that commercial entities could keep the Internet running and growing on their own, without government subsidy. Commercial network providers worked through the Commercial Internet Exchange to manage network traffic.

22. Mainframe Computers

The term "computer" currently refers to a general-purpose, digital, electronic, stored-program calculating machine. The term "mainframe" refers to a large, expensive, multiuser computer able to handle a wide range of applications. The term was derived from the main frame or cabinet in which the central processing unit (CPU) and main memory of a computer were housed, separate from the cabinets that held peripheral devices used for input and output.

Computers are generally classified as supercomputers, mainframes, minicomputers, or microcomputers. This classification is based on factors such as processing capability, cost, and applications, with supercomputers the fastest and most expensive. All computers were called mainframes until the 1960s, including the first supercomputer, the Naval Ordnance Research Calculator (NORC), offered by International Business Machines (IBM) in 1954. In 1960, Digital Equipment Corporation (DEC) shipped the PDP-1, a computer that was much smaller and cheaper than a mainframe.

Early mainframes each filled a large room, cost millions of dollars, and needed a full maintenance staff, partly to repair the damage caused by the heat generated by their vacuum tubes. These machines were characterized by proprietary operating systems and connections through dumb terminals that had no local processing capabilities. As personal computers developed and began to approach mainframes in speed and processing power, however, mainframes evolved to support a client/server relationship and to interconnect with open, standards-based systems. They have become particularly useful for systems that require reliability, security, and centralized control. Their ability to process large amounts of data quickly makes them particularly valuable for storage area networks (SANs). Mainframes today contain multiple CPUs, providing additional speed through multiprocessing. They support many hundreds of simultaneously executing programs, as well as numerous input and output processors for multiplexing devices such as video display terminals and disk drives. Many legacy systems, large applications that have been developed, tested, and used over time, still run on mainframes.

23. Mineral Prospecting

Twentieth century mineral prospecting draws upon the accumulated knowledge of previous exploration and mining activities, advancing technology, expanding knowledge of geologic processes and deposit models, and mining and processing capabilities to determine where and how to look for minerals of interest. Geologic models have been developed for a wide variety of deposit types; the prospector compares geologic characteristics of potential exploration areas with those of deposit models to determine which areas have similar characteristics and are suitable prospecting locations. Mineral prospecting programs are often team efforts, integrating general and site-specific knowledge of geochemistry, geology, geophysics, and remote sensing to "discover" hidden mineral deposits and "measure" their economic potential with increasing accuracy and reduced environmental disturbance. Once a likely target zone has been identified, multiple exploration tools are used in a coordinated program to characterize the deposit and its economic potential.

24. Packet Switching

Historically the first communications networks were telegraphic—the electrical telegraph replacing the mechanical semaphore stations in the mid-nineteenth century. Telegraph networks were largely eclipsed by the advent of the voice (telephone) network, which first appeared in the late nineteenth century, and provided the immediacy of voice conversation. The Public Switched Telephone Network allows a subscriber to dial a connection to another subscriber, with the connection being a series of telephone lines connected together through switches at the telephone exchanges along the route. This technique is known as circuit switching, as a circuit is set up between the subscribers, and is held until the call is cleared.

One of the disadvantages of circuit switching is that the capacity of the link is often significantly underused because of silences in the conversation, yet the spare capacity cannot be shared with other traffic. Another disadvantage is the time it takes to establish the connection before the conversation can begin. One could liken this to sending a railway engine from London to Edinburgh to set the points before returning to pick up the carriages. What is required is a compromise between the immediacy of conversation on an established circuit-switched connection and the ad hoc delivery of a store-and-forward message system. This is what packet switching is designed to provide.
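
A minimal Python sketch of the idea follows: a message is cut into small packets, each carrying a sequence number in its header, so the pieces can be sent independently, share links with other traffic, and be reassembled at the destination even if they arrive out of order. The packet format and payload size here are invented for illustration; real protocols are far more elaborate.

    from dataclasses import dataclass

    PAYLOAD_SIZE = 8  # bytes per packet (illustrative only)

    @dataclass
    class Packet:
        seq: int        # position of this packet within the message
        total: int      # total number of packets in the message
        payload: bytes

    def packetize(message: bytes) -> list[Packet]:
        chunks = [message[i:i + PAYLOAD_SIZE]
                  for i in range(0, len(message), PAYLOAD_SIZE)]
        return [Packet(seq=i, total=len(chunks), payload=c)
                for i, c in enumerate(chunks)]

    def reassemble(packets: list[Packet]) -> bytes:
        # The receiver sorts by sequence number, so arrival order does not matter.
        ordered = sorted(packets, key=lambda p: p.seq)
        return b"".join(p.payload for p in ordered)

    pkts = packetize(b"store-and-forward meets conversation")
    assert reassemble(list(reversed(pkts))) == b"store-and-forward meets conversation"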

25. Personal Computers

A personal computer, or PC, is designed for personal use. Its central processing unit (CPU) runs single-user operating systems and application software, processes input from the user, and sends output to a variety of peripheral devices. Programs and data are stored in memory and on attached storage devices. Personal computers are generally single-user desktop machines, but the term has been applied to any computer that "stands alone" for a single user, including portable computers.

The technology that enabled the construction of personal computers was the microprocessor, a programmable integrated circuit (or "chip") that acts as the CPU. Intel introduced the first microprocessor in 1971, the 4-bit 4004, which it called a "microprogrammable computer on a chip." The 4004 was originally developed as a general-purpose chip for a programmable calculator, but Intel introduced it as part of its Microcomputer System 4-bit, or MCS-4, which also included read-only memory (ROM) and random-access memory (RAM) chips and a shift register chip. In August 1972, Intel followed with the 8-bit 8008, then the more powerful 8080 in June 1974. Following Intel's lead, computers based on the 8080 were usually called microcomputers.

The success of the minicomputer during the 1960s prepared computer engineers and users for "single person, single CPU" computers. Digital Equipment Corporation's (DEC) widely used PDP-10, for example, was smaller, cheaper, and more accessible than large mainframe computers. Time-shared computers running operating systems such as TOPS-10 on the PDP-10—co-developed by the Massachusetts Institute of Technology (MIT) and DEC in 1972—created the illusion of individual control of computing power by providing rapid access to personal programs and files. By the early 1970s, the accessibility of minicomputers, advances in microelectronics, and component miniaturization created expectations of affordable personal computers.

26. Printers

Printers generally can be categorized as either impact or nonimpact. Like typewriters, impact printers generate output by striking the page with a solid object. Impact printers include daisy wheel and dot matrix printers. The daisy wheel printer, introduced in 1972 by Diablo Systems, operates by spinning the daisy wheel to the correct character, whereupon a hammer strikes it, forcing the character through an inked ribbon and onto the paper. Dot matrix printers operate by using a series of small pins to strike an ink-coated ribbon against the paper in a matrix or grid pattern; the strike of each pin transfers ink to the paper at the point of impact. Unlike daisy wheel printers, dot matrix printers can generate italic and other character styles by producing different pin patterns. Nonimpact printers generate images by spraying or fusing ink to paper or other output media. This category includes inkjet printers, laser printers, and thermal printers. Whether they are inkjet or laser, impact or nonimpact, all modern printers incorporate features of dot matrix technology in their design: they operate by generating dots onto paper or other physical media.
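
The following toy Python sketch illustrates the dot matrix principle: a character is simply a grid of dots, and a different pin pattern yields a different glyph. The 5x7 bitmap for the letter "A" below is invented for illustration.

    # Each string is one row of the glyph; "1" means the pin strikes the
    # ribbon at that position and leaves a dot on the paper.
    A_5X7 = [
        "01110",
        "10001",
        "10001",
        "11111",
        "10001",
        "10001",
        "10001",
    ]

    def print_glyph(bitmap):
        for row in bitmap:
            print("".join("#" if bit == "1" else " " for bit in row))

    print_glyph(A_5X7)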

27. Processors for Computers

A processor is the part of the computer system that manipulates data. The first computer processors of the late 1940s and early 1950s performed three main functions and had three main components. They worked in a cycle to gather (fetch), decode, and execute instructions, and they were made up of the arithmetic and logic unit, the control unit, and a small number of storage components, or registers. Today, most processors contain these components and perform these same functions, but since the 1960s they have developed different forms, capabilities, and organization. As with computers in general, increasing speed and decreasing size have marked their development.
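
A minimal Python sketch of this fetch, decode, and execute cycle is given below. The instruction names, the single accumulator register, and the sample program are invented for illustration and do not correspond to any real processor.

    def run(program):
        registers = {"ACC": 0}   # a single accumulator register
        pc = 0                   # program counter maintained by the control unit
        while pc < len(program):
            op, arg = program[pc]          # fetch the next instruction
            pc += 1
            if op == "LOAD":               # decode and execute
                registers["ACC"] = arg
            elif op == "ADD":
                registers["ACC"] += arg
            elif op == "JUMP_IF_NEG":      # a conditional branch changes the flow
                if registers["ACC"] < 0:
                    pc = arg
            elif op == "HALT":
                break
        return registers["ACC"]

    # Computes 2 + 3 = 5.
    print(run([("LOAD", 2), ("ADD", 3), ("HALT", 0)]))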

28. Radionavigation

Astronomical and dead-reckoning techniques furnished the methods of navigating ships until the twentieth century, when the exploitation of radio waves, coupled with electronics, met the needs of fast-moving aircraft and also transformed all navigational techniques. The application of radio to dead reckoning has allowed vessels to determine their positions in all weather by direction finding (known as radio direction finding, or RDF) or by hyperbolic systems. Another use of radio, radar (radio detection and ranging), enables vessels to determine their distance to, or their bearing from, objects of known position. Radionavigation complements traditional navigational methods by employing three frames of reference. First, radio enables a vessel to navigate by lines of bearing to shore transmitters (the most common use of radio), directly analogous to the use of lighthouses for bearings. Second, shore stations may take radio bearings of craft and relay computed positions to them. Third, radio beacons provide aircraft or ships with signals that function as true compasses.

29. Software Application Programs

At the beginning of the computer age, around the late 1940s, the inventors of the intelligent machine were not thinking about applications software, or any software other than what was needed to run the bare machine for mathematical calculation. It was only when Maurice Wilkes' young protégé David Wheeler crafted a tidy set of initial orders for the EDSAC, an early programmable digital computer, that users could string standard subroutines together into a program and have execution jump between them. This was the beginning of software as we know it: programs, other than the operating system, that make the machine do whatever is desired. "Applications" are software other than the system programs that run the actual hardware. Manufacturers always had such software, and as the 1950s progressed they would "bundle" applications with hardware to make expensive computers more attractive. Some programming departments were even placed within marketing departments.

30. Software Engineering

Software engineering aims to develop the programs that allow digital computers to do useful work in a systematic, disciplined manner that produces high-quality software on time and on budget. As computers have spread throughout industrialized societies, software has become a multibillion-dollar industry. Both the users and the developers of software depend a great deal on the effectiveness of the development process.

Software is a concept that did not even pertain to the first electronic digital computers. They were "programmed" through switches and patch cables that physically altered the electrical pathways of the machine. It was not until the Manchester Mark I, the first operational stored-program electronic digital computer, was developed in 1948 at the University of Manchester in England that configuring the machine to solve a specific problem became a matter of software rather than hardware. Thereafter, instructions were stored in memory along with data.

31. Supercomputers

Supercomputers are high-performance computing devices generally used for numerical calculation: the study of physical systems either through numerical simulation or through the processing of scientific data. Initially they were large, expensive mainframe computers, usually owned by government research laboratories. By the end of the twentieth century they were more often networks of inexpensive small computers. The common element of all of these machines was their ability to perform high-speed floating-point arithmetic—binary arithmetic that approximates decimal numbers with a fixed number of bits—the basis of numerical computation.
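
The short Python snippet below illustrates what "approximates decimal numbers with a fixed number of bits" means in practice: standard double-precision floating point cannot represent 0.1 exactly, so 0.1 + 0.2 is not exactly 0.3.

    from decimal import Decimal

    x = 0.1 + 0.2
    print(x)            # 0.30000000000000004
    print(x == 0.3)     # False
    print(Decimal(0.1)) # the exact binary value actually stored for 0.1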

With the advent of inexpensive supercomputers, these machines moved beyond the large government labs and into smaller research and engineering facilities. Some were used for the study of social science. A few were employed by business concerns, such as stock brokerages or graphic designers.

32. Systems Programs

The operating systems used in all computers today are a result of the development and organization of early systems programs designed to control and regulate the operations of computer hardware. Early computing machines such as the ENIAC of 1945 were "programmed" manually by connecting cables and setting switches for each new calculation. With the advent of the stored-program computer in the late 1940s (the Manchester Mark I, the EDVAC, and the EDSAC (electronic delay storage automatic calculator)), the first system programs such as assemblers and compilers were developed and installed. These programs performed basic, oft-repeated operations for computer use, including converting programs into machine code, storing and retrieving files, managing computer resources and peripherals, and aiding in the compilation of new programs. With the advent of programming languages, and the spread of computers in research centers, universities, and businesses during the late 1950s and 1960s, a large group of users began developing programs, improving usability, and organizing system programs into operating systems.

The 1970s and 1980s saw a turn away from some of the complications of system software, an interweaving of features from different operating systems, and the development of systems programs for the personal computer. In the early 1970s, two programmers from Bell Laboratories, Ken Thompson and Dennis Ritchie, developed a smaller, simpler operating system called UNIX. Unlike earlier system software, UNIX was portable and could be run on different computer systems. Due in part to low licensing fees and simplicity of design, UNIX grew in popularity throughout the 1970s. At the Xerox Palo Alto Research Center, research during the 1970s led to system software built around a GUI (graphical user interface), ideas that later shaped the system software of the Apple Macintosh computer. This type of system software filters the user's interaction with the computer through graphics or icons representing computer processes. In 1985, a year after the release of the Apple Macintosh, a GUI was overlaid on Microsoft's then-dominant operating system, MS-DOS, to produce Microsoft Windows. The Microsoft Windows series of operating systems became, and remains, the dominant operating system on personal computers.

33. World Wide Web

The World Wide Web (Web) is a "finite but unbounded" collection of media-rich digital resources connected through high-speed digital networks. It relies upon the Internet protocol suite, which supports cross-platform transmission and makes available a wide variety of media types (i.e., multimedia). This cross-platform delivery environment represents an important departure from earlier network communication protocols such as e-mail, telnet, and the file transfer protocol (FTP) because it is content-centric. It is also to be distinguished from earlier document acquisition systems such as Gopher, designed in 1991, originally as a mainframe program but quickly implemented over networks, and the Wide Area Information Server (WAIS), also released in 1991, which accommodated a narrower range of media formats and did not include hyperlinks in its navigation protocol. Following the success of Gopher on the Internet, the Web quickly extended and enriched the metaphor of integrated browsing and navigation. This made it possible to navigate and peruse a wide variety of media types effortlessly on the Web, which in turn led to the Web's hegemony as an Internet protocol.

History of Computer Technology

Computer Technology

The modern computer—the (electronic) digital computer in which the stored-program concept is realized and hence self-modifying programs are possible—was only invented in the 1940s. Nevertheless, the history of computing (interpreted as the usage of modern computers) is only understandable against the background of the many forms of information processing, as well as of the mechanical computing devices that solved mathematical problems, in the first half of the twentieth century. The part these predecessors played in the invention and early history of the computer may be interpreted from two different perspectives: on the one hand, it can be argued that these machines prepared the way for the modern digital computer; on the other hand, it can be argued that the computer, which was invented as a mathematical instrument, was reconstructed to be a data-processing machine, a control mechanism, and a communication tool.

The invention and early history of the digital computer has its roots in two different kinds of developments: first, information processing in business and government bureaucracies; and second, the use and the search for mathematical instruments and methods that could solve mathematical problems arising in the sciences and in engineering.

Origins in Mechanical Office Equipment

The development of information processing in business and government bureaucracies had its origins in the late nineteenth century, which was not just an era of industrialization and mass production but also a time of continuous growth in administrative work. The economic precondition for this development was the creation of a global economy, which caused growth in production of goods and trade. This brought with it an immense increase in correspondence, as well as monitoring and accounting activities—corporate bureaucracies began to collect and process data in increasing quantities. Almost at the same time, government organizations became more and more interested in collating data on population and demographic changes (e.g., expanding tax revenues, social security, and wide-ranging planning and monitoring functions) and analyzing this data statistically.

Bureaucracies in the U.S. and in Europe reacted in different ways to these changes. While in Europe for the most part neither office machines nor telephones entered offices until 1900, in the U.S. the information-handling techniques of bureaucracies were radically changed in the last quarter of the nineteenth century by the introduction of mechanical devices for writing, copying, and counting data. The rise of big business in the U.S. had caused a growing demand for management control tools, which was met by a new ideology of systematic management together with the products of the rising office machine industry. Because of a later start in industrialization, government and businesses in the U.S. were not forced to reorganize their bureaucracies when they introduced office machines. This, together with an ideological preference for modern office equipment, gave rise to a market for office machines and to a far-reaching mechanization of office work in the U.S. In the 1880s typewriters and cash registers became very widespread, followed by adding machines and bookkeeping machines in the 1890s. From 1880 onward the makers of office machines in the U.S. underwent a period of enormous growth, and by 1920 the office machine industry was generating about $200 million in annual revenue. In Europe, by comparison, the mechanization of office work emerged about two decades later than in the U.S.—both Germany and Britain adopted the American system of office organization and the extensive use of office machines for the most part no earlier than the 1920s.

During the same period the rise of a new office machine technology began: punched card systems, invented by Herman Hollerith to analyze the U.S. census of 1890. By 1911 Hollerith's company had only about 100 customers, but after it merged that year with two other companies to become the Computing-Tabulating-Recording Company (CTR), it began a tremendous ascent to become the world leader in the office machine industry. CTR's general manager, Thomas J. Watson, understood the extraordinary potential of these punched-card accounting devices, which enabled their users to process enormous amounts of data largely automatically, rapidly, and at an acceptable level of cost and effort. Due to Watson's insight and his extraordinary management abilities, the company (since renamed International Business Machines (IBM)) became the fourth largest office machine supplier in the world by 1928—topped only by Remington Rand, National Cash Register (NCR), and the Burroughs Adding Machine Company.

Origin of Calculating Devices and Analog Instruments

Compared with the fundamental changes in the world of corporate and government bureaucracies caused by office machinery during the late nineteenth and early twentieth century, calculating machines and instruments seemed to have only a minor influence in the world of science and engineering. Scientists and engineers had always been confronted with mathematical problems and had over the centuries developed techniques such as mathematical tables. However, many new mathematical instruments emerged in the nineteenth century and increasingly began to change the world of science and engineering. Apart from the slide rule, which came into popular use in Europe from the early nineteenth century onwards (and became the symbol of the engineer for decades), calculating machines and instruments were only produced on a large scale in the middle of the nineteenth century.

In the 1850s the production of calculating machines as well as that of planimeters (used to measure the area of closed curves, a typical problem in land surveying) started on different scales. Worldwide, less than 2,000 calculating machines were produced before 1880, but more than 10,000 planimeters were produced by the early 1880s. Also, various types of specialized mathematical analog instruments were produced on a very small scale in the late nineteenth century; among them were integraphs for the graphical solution of special types of differential equations, harmonic analyzers for the determination of Fourier coefficients of a periodic function, and tide predictors that could calculate the time and height of the ebb and flood tides.

Nonetheless, in 1900 only geodesists and astronomers (as well as part of the engineering community) made extensive use of mathematical instruments. In addition, the establishment of applied mathematics as a new discipline took place at German universities on a small scale and the use of apparatus and machines as well as graphical and numerical methods began to flourish during this time. After World War I, the development of engineering sciences and of technical physics gave a tremendous boost to applied mathematics in Germany and Britain. In general, scientists and engineers became more aware of the capabilities of calculating machines and a change of the calculating culture—from the use of tables to the use of calculating machines—took place.

One particular problem increasingly encountered by mechanical and electrical engineers in the 1920s was the solution of several types of differential equations that could not be solved analytically. As one important result of this development, a new type of analog instrument—the so-called "differential analyzer"—was invented in 1931 by the engineer Vannevar Bush at the Massachusetts Institute of Technology (MIT). In contrast to its predecessors—several types of integraphs—this machine (later called an analog computer) could be used to solve not only a special class of differential equation but also a more general class of differential equations associated with engineering problems. Before the digital computer was invented in the 1940s there was intensive use of analog instruments similar to Bush's differential analyzer, and a number of machines were constructed in the U.S. and in Europe on the model of Bush's machine before and during World War II. Analog instruments also became increasingly important in fields such as the fire control of artillery on warships and the control of rockets. It is worth mentioning that an analog computer could be constructed for only a limited class of scientific and engineering problems—weather forecasting and the problem of shock waves produced by an atomic bomb, for example, required the solution of partial differential equations, for which a digital computer was needed.

The Invention of the Computer

The invention of the electronic digital stored-program computer is directly connected with the development of numerical calculation tools for the solution of mathematical problems in the sciences and in engineering. The ideas that led to the invention of the computer were developed simultaneously by scientists and engineers in Germany, Britain, and the U.S. in the 1930s and 1940s. The first freely programmable, program-controlled automatic calculator was developed by the civil engineering student Konrad Zuse in Germany. Zuse started development work on program-controlled computing machines in the 1930s, when he had to deal with extensive calculations in statics, and in 1941 his Z3, which was based on electromechanical relay technology, became operational.

Several similar developments were in progress in the U.S. at the same time. In 1937 Howard Aiken, a physics student at Harvard University, approached IBM to build a program-controlled calculator—later called the "Harvard Mark I." On the basis of a concept Aiken had developed from his experience with the numerical solution of partial differential equations, the machine was built and became operational in 1944. At almost the same time a series of important relay computers was built at the Bell Laboratories in New York following a suggestion by George R. Stibitz. All these developments in the U.S. were spurred by the outbreak of World War II. The first large-scale programmable electronic computer, the Colossus, was built in complete secrecy in 1943 to 1944 at Bletchley Park in Britain in order to help break the ciphers produced by the German Lorenz teleprinter machines.

However, it was neither these relay calculators nor the Colossus that proved decisive for the development of the universal computer, but the ENIAC (electronic numerical integrator and computer), which was developed at the Moore School of Engineering at the University of Pennsylvania. Extensive ballistic calculations were carried out there for the U.S. Army during World War II with the aid of the Bush differential analyzer and more than a hundred women ("computers") working on mechanical desk calculators. Observing that this capacity was barely sufficient to compute the artillery firing tables, the physicist John W. Mauchly and the electronic engineer John Presper Eckert started developing the ENIAC, a digital version of the differential analyzer, in 1943 with funding from the U.S. Army.

In 1944 the mathematician John von Neumann turned his attention to the ENIAC because of his mathematical work on the Manhattan Project (on the implosion design of the atomic bomb). While the ENIAC was being built, von Neumann and the ENIAC team drew up plans for a successor machine in order to remedy the shortcomings of the ENIAC concept, such as its very small memory and the time-consuming reprogramming (actually rewiring) required to change the setup for a new calculation. In these meetings the idea of a stored-program, universal machine evolved: memory would be used to store the program in addition to data, enabling the machine to execute conditional branches and change the flow of the program. The concept of a computer in the modern sense of the word was born, and in 1945 von Neumann wrote the influential "First Draft of a Report on the EDVAC," which described the stored-program, universal computer. The logical structure presented in this draft report is now referred to as the "von Neumann architecture." The EDVAC report was originally intended for internal use, but once made freely available it became the "bible" for computer pioneers throughout the world in the 1940s and 1950s. The first practical computer built on these principles operated at Cambridge University in the U.K.: in June 1949 the EDSAC (electronic delay storage automatic calculator), built by Maurice Wilkes and designed according to the EDVAC principles, became operational.

The Computer as a Scientific Instrument

As soon as the computer was invented, a growing demand for computers among scientists and engineers evolved, and numerous American and European universities started their own computer projects in the 1940s and 1950s. After the technical difficulties of building an electronic computer were solved, scientists grasped the opportunity to use the new scientific instrument for their research. For example, at the University of Göttingen in Germany, the early computers were used for initial value problems of partial differential equations arising in hydrodynamic problems from atomic physics and aerodynamics. Another striking example was the application of von Neumann's computer at the Institute for Advanced Study (IAS) in Princeton to numerical weather forecasting in 1950. As a result, numerical weather forecasts could be made on a regular basis from the mid-1950s onwards.

Mathematical methods have always been of importance for science and the engineering sciences, but only the use of the electronic digital computer (as an enabling technology) made it possible to broaden the application of mathematical methods to such a degree that, by the end of the twentieth century, research in science, medicine, and engineering without computer-based mathematical methods had become virtually inconceivable. A number of additional computer-based techniques, such as scientific visualization, medical imaging, computerized tomography, pattern recognition, image processing, and statistical applications, have become of the utmost significance for science, medicine, engineering, and the social sciences. In addition, the computer fundamentally changed the way engineers construct technical artifacts through computer-based methods such as computer-aided design (CAD), computer-aided manufacture (CAM), computer-aided engineering, control applications, and finite-element methods. The most striking example, however, is the development of scientific computing and computer modeling, which became accepted as a third mode of scientific research complementing experimentation and theoretical analysis. Scientific computing and computer modeling are based on supercomputers as the enabling technology, which became important tools for modern science, routinely used to simulate physical and chemical phenomena. These high-speed computers became equated with the machines developed by Seymour Cray, who built the fastest computers in the world for many years. The supercomputers he launched, such as the legendary Cray-1 of 1976, were the basis for computer modeling of real-world systems, and helped, for example, the U.S. defense industry to build weapons systems and the oil industry to create geological models showing potential oil deposits.

Growth of Digital Computers in Business and Information Processing

When the digital computer was invented as a mathematical instrument in the 1940s, it could not have been foreseen that this new artifact would ever be of much importance in the business world. About 50 firms entered the computer business worldwide in the late 1940s and early 1950s, and the computer was reconstructed as a type of electronic data-processing machine that took the place of punched-card technology as well as other office machine technology. It is interesting to note that there were mainly three types of companies building computers in the 1950s and 1960s: newly created computer firms (such as the company founded by the ENIAC inventors Eckert and Mauchly), electronics and control equipment firms (such as RCA and General Electric), and office appliance companies (such as Burroughs and NCR). Despite the fact that the first digital computers were put on the market by a German and a British company, U.S. firms dominated the world market from the 1950s onward, as these firms had the biggest market as well as financial support from the government.

Generally speaking, the Cold War exerted an enormous influence on the development of computer technology. Until the early 1960s the U.S. military and the defense industry were the central drivers of the digital computer's expansion, serving as the main market for computer technology and shaping and speeding up the formation of the rising computer industry. Because of the U.S. military's role as the "tester" of prototype hardware and software, it had a direct and lasting influence on technological developments; in addition, it has to be noted that the spread of computer technology was partly hindered by military secrecy. Even after the emergence of a large civilian computer market in the 1960s, the U.S. military maintained its influence by investing a great deal in computer hardware and software and in computer research projects.

From the middle of the 1950s onwards the world computer market was dominated by IBM, which accounted for more than 70 percent of computer industry revenues until the mid-1970s. The reasons for IBM's overwhelming success were diverse, but the company had at its disposal a unique combination of technical and organizational capabilities that prepared it perfectly for the mainframe computer market. In addition, IBM benefited from enormous government contracts, which helped it develop excellence in computer technology and design. However, the greatest advantage of IBM was without doubt its marketing organization and its reputation as a service-oriented firm, used to working closely with customers to adapt machinery to specific problems, and this key difference between IBM and its competitors persisted right into the computer age.

During the late 1950s and early 1960s, the computer market—consisting of IBM and seven other companies called the "seven dwarves"—was dominated by IBM, with its 650 and 1401 computers. By 1960 the market for computers was still small: only about 7,000 computers had been delivered by the computer industry, and at this time even IBM was primarily a punched-card machine supplier, which was still the major source of its income. Only in 1960 did a boom in demand for computers start, and by 1970 the number of computers installed worldwide had increased to more than 100,000. The computer industry was on track to become one of the world's major industries, and it was totally dominated by IBM.

The outstanding computer system of this period was IBM's System/360. Announced in 1964 as a compatible family of computers sharing the same architecture and interchangeable peripheral devices, it was intended to solve IBM's problems with a hotchpotch of incompatible product lines (which had created large problems in the development and maintenance of a great many different hardware and software products). Despite the fact that neither the technology used nor the systems programming was particularly advanced for the time, the System/360 established a new standard for mainframe computers for decades. Various computer firms in the U.S., Europe, Japan, and even Russia concentrated on copying components and peripherals for the System/360 or tried to build System/360-compatible computers.

The growth of the computer market during the 1960s was accompanied by market shakeouts: two of the "seven dwarves" left the computer business after the first computer recession in the early 1970s, and afterwards the computer market was controlled by IBM and the BUNCH (Burroughs, UNIVAC, NCR, Control Data, and Honeywell). At the same time, an internationalization of the computer market took place—U.S. companies controlled the world market for computers—which caused considerable fears over loss of national independence among European and Japanese governments, fears that subsequently spurred national computing programs. While the European attempts to create national champions, as well as the more general attempt to create a Europe-wide market for mainframe computers, failed in the end, Japan's attempt to found a national computer industry was successful: to this day Japan is the only nation able to compete with the U.S. across a wide array of high-tech computer-related products.

Real-Time and Time-Sharing

Until the 1960s almost all computers in government and business ran batch-processing applications (i.e., the computers were only used in the same way as the punched-card accounting machines they had replaced). In the early 1950s, however, the computer industry introduced to the business sector a new mode of computing called "real time," originally developed for military purposes in MIT's Whirlwind project. This project was started during World War II with the aim of designing an aircraft simulator by analog methods, and it later became part of a research and development program for the gigantic computerized anti-aircraft defense system SAGE (semi-automatic ground environment), built by IBM in the 1950s.

The demand for this new mode of computing was created by cultural and structural changes in the economy. The increasing number of financial transactions in banks and insurance companies, as well as increasing airline travel, necessitated new computer-based information systems, which ultimately led to new forms of business organization built on information technology.

The case of SABRE, the first computerized airline reservation system, developed for American Airlines by IBM in the 1950s and finally implemented in the early 1960s, illustrates these cultural and structural changes in the economy. Until the early 1950s airline reservations had been made manually without any problems, but by 1953 this system was in crisis because increased air traffic and growing flight plan complexity had made reservation costs insupportable. SABRE became a complete success, demonstrating the potential of centralized real-time computing systems connected via a network. The system enabled flight agents throughout the U.S., equipped with desktop terminals, to gain direct, real-time access to the central reservation system running on central IBM mainframe computers, while the airline was able to assign appropriate resources in response. SABRE thus offered an effective combination of advantages: better utilization of resources and much higher customer convenience.

Very soon this new mode of computing spread throughout the business and government world and became commonplace across the service and distribution sectors of the economy; for example, bank tellers and insurance account representatives increasingly worked at terminals. On the one hand, structural information problems led managers in this direction; on the other hand, the increasing use of computers as information-handling machines in government and business had given rise to the idea of computer-based, accessible data retrieval. In the end, more and more IBM customers wanted to link dozens of operators directly to central computers by using terminal keyboards and display screens.

In the late 1950s and early 1960s—at the same time that IBM and American Airlines had begun the development of the SABRE airline reservation system—a group of brilliant computer scientists proposed a new mode of computer usage called "time sharing." Instead of dedicating a multi-terminal system solely to a single application, they had the computer utility vision of organizing a mainframe computer so that several users could interact with it simultaneously. This vision was to change the nature of computing profoundly, because computing no longer had to be delivered to end users through programmers and systems analysts, and by the late 1960s time-sharing computers had become widespread in the U.S.

Particularly important for this development was the work of J.C.R. Licklider at the Advanced Research Projects Agency (ARPA) of the U.S. Department of Defense. In 1960 Licklider had published a now-classic paper, "Man–Computer Symbiosis," proposing the use of computers to augment human intellect and creating the vision of interactive computing. Licklider was very successful in translating into action his idea of a network allowing people on different computers to communicate, and he convinced ARPA to start an enormous research program in 1962, whose budget surpassed that of all other sources of U.S. public research funding for computers combined. The ARPA research programs resulted in a series of fundamental advances in computer technology in areas such as computer graphics, artificial intelligence, and operating systems. For example, even the most influential current operating system, the general-purpose time-sharing system Unix, developed in the early 1970s at Bell Laboratories, was a spin-off of an ambitious ARPA-funded operating system project, Multics. The designers of Unix successfully kept complexity at bay through a clear, minimalist approach to software design, and created a multitasking, multiuser operating system that became a standard operating system of the 1980s.

Electronic Component Revolution

While the nature of business computing was being changed by new paradigms such as real time and time sharing, advances in solid-state components increasingly became a driving force for fundamental changes in the computer industry, leading to a dynamic interplay between new computer designs and new programming techniques that produced a remarkable series of technical developments. The technical progress of the mainframe computer had always run parallel to changes in electronic components. During the period from 1945 to 1965, two fundamental transformations in the electronics industry took place, marked by the invention of the transistor in 1947 and of the integrated circuit in 1958 to 1959. While the first generation of computers—lasting until about 1960—was characterized by vacuum tubes (valves) as switching elements, the second generation used the much smaller and more reliable transistors, which could be produced at lower cost. A new phase was inaugurated when an entire integrated circuit was produced on a chip of silicon in 1961, and when the first integrated circuits were produced for the military in 1962. A remarkable pace of progress in semiconductor innovation, known as the "revolution in miniature," began to speed up the computer industry. The third generation of computers, characterized by the use of integrated circuits, began with the announcement of the IBM System/360 in 1964 (although this computer system did not use true integrated circuits). The most important effect of the introduction of integrated circuits was not to strengthen the leading mainframe computer systems but to destroy Grosch's Law, which stated that computing power increases as the square of its cost. In fact, the cost of computing power fell dramatically during the next ten years.

This became clear with the introduction of the first computer to use integrated circuits on a full scale: in 1965 the Digital Equipment Corporation (DEC) offered its PDP-8 computer for just $18,000, creating a new class of computers called minicomputers—small in size and low in cost—and opening the market to new customers. Minicomputers were mainly used in areas other than general-purpose computing, such as industrial applications and interactive graphics systems. The PDP-8 became the first widely successful minicomputer, with over 50,000 units sold, demonstrating that there was a market for smaller computers. This success of DEC (by 1970 it had become the world's third largest computer manufacturer) was supported by dramatic advances in solid-state technology. During the 1960s the number of transistors on a chip doubled every two years, and as a result minicomputers became steadily more powerful and less expensive at a remarkable pace.

Personal Computing

The most striking consequence of the exponential increase in the number of transistors on a chip during the 1960s—as stated by "Moore's Law," the observation that the number of transistors on a chip doubles roughly every two years—was not the lowering of the costs of mainframe and minicomputer processing and storage, but the introduction of the first consumer products based on chip technology, such as hand-held calculators and digital watches, in about 1970. The market dynamics in these industries were changed overnight by the shift from mechanical to chip technology, which led to a collapse in prices as well as a dramatic industry shakeout. These episodes marked only the beginning of wide-ranging changes in the economy and society during the last quarter of the twentieth century, leading to a situation in which chips played an essential role in almost every part of business and modern life.
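
As a back-of-the-envelope illustration of such doubling, the short Python sketch below (the function name growth_factor is invented here) computes how much a transistor count grows over a given number of years if it doubles every two years.

    def growth_factor(years: float, doubling_period: float = 2.0) -> float:
        # Doubling every `doubling_period` years gives 2 ** (years / doubling_period).
        return 2 ** (years / doubling_period)

    print(growth_factor(10))   # 32.0  -> a 32-fold increase over one decade
    print(growth_factor(20))   # 1024.0 -> roughly a thousand-fold over two decades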

The case of the invention of the personal computer illustrates that developing the microprocessor as an enabling technology was not by itself sufficient to create a new product, and shows how much new technologies can be socially constructed by cultural factors and commercial interests. When the microprocessor, a single-chip integrated circuit implementation of a CPU, was launched by the semiconductor company Intel in 1971, there was no technical obstacle to producing a reasonably priced microcomputer, yet it took six years until the PC emerged as a consumer product. None of the traditional mainframe and minicomputer companies were involved in creating the early personal computer. Instead, a group of computer hobbyists, together with the "computer liberation" movement in the U.S., became the driving force behind the invention of the PC. These two groups were desperately keen on a low-priced type of minicomputer for use at home for leisure activities such as computer games; more broadly, they had the counterculture vision of unrestricted, personal access to an inexpensive computer utility rich in information. When in 1975 the Altair 8800, an Intel 8080 microprocessor-based computer, was offered as an electronics hobbyist kit for less than $400, these two groups began to realize their vision of a "personal computer." Very soon dozens of computer clubs and computer magazines were founded around the U.S., and these computer enthusiasts created the personal computer by combining the Altair with keyboards, disk drives, and monitors, and by developing standard software for it. Consequently, in only two years a more or less useless hobbyist kit had been turned into a machine that could readily become a consumer product.

The computer hobbyist period ended in 1977, when the first standard machines for an emerging consumer mass market were sold. These included products such as the Commodore PET and the Apple II, which included its own monitor, disk drive, and keyboard and was provided with several basic software packages. Over the next three years, spreadsheet, word processing, and database software were developed, and an immense market for games software evolved. As a result, personal computers became more and more a consumer product for ordinary people, and Apple's revenues shot to more than $500 million in 1982. By 1980 the personal computer had also become a business machine, and IBM decided to develop its own personal computer, which was introduced as the IBM PC in 1981. It became an overwhelming success and set a new industry standard.

Apple tried to compete by launching its new Macintosh computer in 1984, equipped with a revolutionary graphical user interface (GUI) that set a new standard for user-friendly human–computer interaction. It was based on technology created by computer scientists at the Xerox Palo Alto Research Center in California, who had picked up ideas about human–computer interaction developed at the Stanford Research Institute and at the University of Utah. Despite the fact that the Macintosh's GUI was far superior to the MS-DOS operating system of the IBM-compatible PCs, Apple failed to win the business market and remained a niche player with a market share of about 10 percent. The PC mainstream was instead determined by the companies IBM had chosen as its original suppliers in 1981 for the microprocessor (Intel) and the operating system (Microsoft). While IBM failed to retain control of the PC operating system market in a software war with Microsoft, Microsoft achieved dominance not only of the key market for PC operating systems but also of the market for office applications during the first half of the 1990s.

In the early 1990s computing again underwent fundamental changes with the appearance of the Internet, and for most computer users networking became an integral part of what it means to have a computer. Furthermore, the rise of the Internet indicated the impending arrival of a new "information infrastructure" and of a "digital convergence," as the coupling of computers and communications networks was often called.

In addition, the 1990s were a period of an information technology boom, largely fueled by hype around the Internet. For years it seemed to a great many managers and journalists that the Internet would become not just an indispensable business tool but also a miracle cure for economic growth and prosperity. At the same time, computer scientists and sociologists began to predict the arrival of a new "information age" based on the Internet as a "technological revolution" reshaping the "material basis" of industrial societies.

The Internet was the outcome of an unusual collaboration within a military–industrial–academic complex that promoted the development of this extraordinary innovation. It grew out of a military network called the ARPAnet, a project established and funded by ARPA in the 1960s. The ARPAnet was initially devoted to supporting data communications for defense research projects and was used by only a small number of researchers in the 1970s. Its further development was primarily driven by unintended forms of network usage: users of the ARPAnet became very much attracted by the opportunity to communicate through electronic mail, which rapidly surpassed all other forms of network activity. Another unplanned spin-off of the ARPAnet was the Usenet (Unix User Network), which started in 1979 as a link between two universities and enabled its users to subscribe to newsgroups. Electronic mail became a driving force for the creation of a large number of new proprietary networks funded by the existing computer services industry or by organizations such as the NSF (NSFnet). Because network users wanted email to be able to cross network boundaries, an ARPA project on "internetworking" became the origin of the "Internet"—a network of networks linked by several layers of protocols, such as TCP/IP (transmission control protocol/internet protocol), which quickly developed into the de facto standard.

Only after government funding had solved many of the most essential technical issues and had shaped a number of the Internet's most characteristic features did private-sector entrepreneurs start Internet-related ventures and quickly develop user-oriented enhancements. Nevertheless, the Internet did not make a promising start, and it took more than ten years before significant numbers of networks were connected. In 1980 the Internet had fewer than two hundred hosts, and during the next four years the number of hosts rose only to about 1,000. Only when the Internet reached the educational and business community of PC users in the late 1980s did it start to become an important economic and social phenomenon. The number of hosts then began to grow explosively—by 1988 there were over 50,000. An important and unforeseen side effect of this development was the transformation of the Internet into a new electronic publishing medium. The electronic publishing development that excited most interest was the World Wide Web, originally developed at the CERN high-energy physics laboratory in Geneva in 1989. Soon there were millions of documents on the Internet, and private PC users became excited by the joys of surfing the Web. A number of firms such as AOL soon provided low-cost network access and a range of consumer-oriented information services. The Internet boom was also helped by the Clinton–Gore presidential election campaign's promotion of the "information superhighway" and by the extensive news reporting on the national information infrastructure in the early 1990s. Nevertheless, for many observers it was astounding how fast the number of hosts on the Internet increased during the following years—from more than 1 million in 1992 to 72 million in 1999.

The overwhelming success of the PC and of the Internet tends to hide the fact that their arrival marked a branching in computer history, not a succession. (Consider, for example, mainframe computers, which continue to run and remain of great importance to government facilities and the private sector, such as banks and insurance companies, or supercomputers, which are of the utmost significance for modern science and engineering.) Furthermore, it should be noted that only a small part of the computing performed today is easily observable—98 percent of programmable CPUs are used in embedded systems such as automobiles, medical devices, washing machines, and mobile telephones.

University Library, University of Illinois at Urbana-Champaign

Electrical and Computer Engineering Research Resources: Find Articles & Papers

  • Find Articles & Papers
  • High-Impact Journals
  • Standards & Technical Reports
  • Patents & Government Documents
  • E-Books & Reference
  • Dissertations & Theses
  • Additional Resources

Engineering Easy Search

University library search engines.

  • Grainger Engineering Library Homepage With specialized searches for Engineering and the Physical Sciences.
  • Easy Search The easiest way to locate University Library resources, materials, and more!
  • Find Online Journals Search by title or by subject to view our subscription details, including date ranges and where you can access full text.
  • Journal and Article Locator Finds electronic or print copy of articles by using a citation.

Engineering Article Databases

  • Engineering Village Search for articles, conference papers, and report information in all areas of engineering. Full text is often available through direct download.
  • Scopus Search periodicals, conference proceedings, technical reports, trade literature, patents, books, and press releases in all engineering fields. Some full text is available as direct downloads.
  • Web of Science (Core Collection) Search for articles in science and engineering. Also provides the Science Citation Index, which tracks citations in science and technical journals published since 1981. Journal Citation Reports are also available through ISI.

Electrical & Computer Engineering

  • ACM Digital Library. Provides access to tables of contents, abstracts, reviews, and the full text of every article ever published by ACM, as well as bibliographic citations from major publishers in computing.
  • ENGnetBASE. A collection of best-selling engineering handbooks and reference titles. Includes access to sub-collections: CivilENGINEERINGnetBASE, ElectricalENGINEERINGnetBASE, GeneralENGINEERINGnetBASE, IndustrialENGINEERINGnetBASE, MechanicalENGINEERINGnetBASE, MiningENGINEERINGnetBASE.
  • IEEE Xplore. Provides full-text access to IEEE transactions, IEEE and IEE journals, magazines, and conference proceedings published since 1988, and to all current IEEE standards, with additional search and access features for IEEE/IEE digital library users. Browsable by books & e-books, conference publications, education and learning, journals and magazines, standards, and topic. Also provides links to IEEE standards, IEEE Spectrum, and other sites.
  • INSPEC. Provides access to bibliographic citations and abstracts of the scientific and technical literature in physics, electrical engineering, electronics, communications, control engineering, computers and computing, information technology, and manufacturing and production engineering. Material covered includes journal articles, conference proceedings, reports, dissertations, patents, and books published around the world.
  • Microelectronics Packaging Materials Database (MPMD). Contains data and information on the thermal, mechanical, electrical, and physical properties of electronics packaging materials. Available in a Web-based format; continually updated and expanded.
  • SPIE Digital Library. Contains full-text papers on optics and photonics from SPIE journals and proceedings published since 1990. Approximately 15,000 new papers are added each year.

  • Last Updated: Jun 16, 2023 9:35 AM
  • URL: https://guides.library.illinois.edu/ece

Engineer's Planet

M.Tech, B.Tech Projects, PhD Thesis and Research Paper Writing Services in Delhi

Computer Network Project Topics With Abstracts and Base Papers 2024

Explore our curated selection of M.Tech project topics in computer networks for 2024, each paired with a trending IEEE base paper. The projects reflect current developments and challenges in networking, from advanced routing algorithms and network security to emerging technologies such as 5G and IoT, and every topic is accompanied by its base paper and a short abstract. The titles are chosen to match current trends and the latest IEEE standards, and the abstracts give a quick sense of each project's scope, methodology, and potential impact. Whether you are a student, researcher, or industry professional, the collection offers a starting point for work at the forefront of networking.

M.Tech Project Topics List in Computer Networks

Research Topics

Research in Systems Engineering at Cornell covers an extremely broad range of topics. Because of the nature of systems science and engineering, the research takes a collaborative approach, drawing on faculty and students from many different disciplines, both in traditional engineering areas and in fields outside of engineering such as health care, food systems, environmental studies, architecture and regional planning, and many others. Research areas include:

  • Artificial Intelligence
  • Computational Science and Engineering
  • Computer Systems
  • Data Mining
  • Earth and Atmospheric Science
  • Energy Systems
  • Health Systems
  • Heat and Mass Transfer
  • Information Theory and Communication
  • Infrastructure Systems
  • Mechanics of Biological Materials
  • Natural Hazards
  • Programming Languages - CS
  • Remote Sensing
  • Robotics and Autonomy
  • Satellite Systems
  • Scientific Computing
  • Sensors and Actuators
  • Signal and Image Processing
  • Space Science and Engineering
  • Statistics and Machine Learning
  • Statistical Mechanics and Molecular Simulation
  • Sustainable Energy Systems
  • Systems and Networking - CS
  • Transportation Systems Engineering
  • Water Systems

Algorithms

  • Oliver Gao | Civil and Environmental Engineering
  • David Goldberg | Operations Research and Information Engineering
  • Adrian Lewis | Operations Research and Information Engineering
  • Linda Nozick | Civil and Environmental Engineering
  • Francesca Parise | Electrical and Computer Engineering
  • Mason Peck | Mechanical and Aerospace Engineering
  • Patrick Reed | Civil and Environmental Engineering
  • Samitha Samaranayake | Civil and Environmental Engineering
  • Timothy Sands | Mechanical and Aerospace Engineering
  • Huseyin Topaloglu | Operations Research and Information Engineering
  • Fengqi You | Chemical and Biomolecular Engineering

Infrastructure

  • Mark Campbell | Mechanical and Aerospace Engineering
  • Kirstin Petersen | Electrical and Computer Engineering
  • Patrick Reed | Civil and Environmental Engineering

Computational Science and Engineering

  • Jose Martinez | Electrical and Computer Engineering

Data Science

  • Madeleine Udell | Operations Research and Information Engineering

Earth and Atmospheric Science

  • Maha Haji | Mechanical and Aerospace Engineering
  • Semida Silveira | Systems Engineering
  • Jery Stedinger | Civil and Environmental Engineering
  • Jefferson Tester | Chemical and Biomolecular Engineering
  • Lang Tong | Electrical and Computer Engineering
  • Fengqi You | Chemical and Biomolecular Engineering

Health Systems

  • Shane Henderson | Operations Research and Information Engineering
  • John Muckstadt | Operations Research and Information Engineering
  • Jamol Pender | Operations Research and Information Engineering
  • Rana Zadeh | Human Centered Design
  • Yiye Zhang | Weill Cornell Medicine

Heat and Mass Transfer

Information Theory and Communications

  • Stephen Wicker | Electrical and Computer Engineering

Infrastructure Systems

Programming Languages - CS

  • Andrew Myers | Computer Science
  • Fred Schneider | Computer Science

Remote Sensing

  • Mason Peck | Mechanical and Aerospace Engineering

Robotics

  • Mark Campbell | Mechanical and Aerospace Engineering
  • Robert Shepherd | Mechanical and Aerospace Engineering

Satellite Systems

  • Ricardo Daziano | Civil and Environmental Engineering
  • Linda Nozick | Civil and Environmental Engineering
  • Bart Selman | Computer Science

Statistical Mechanics and Molecular Simulation

  • Timur Dogan | Arts Architecture and Planning

Systems and Networking - CS

  • Ken Birman | Computer Science
  • Hakim Weatherspoon | Computer Science

Transportation Systems Engineering

  • Richard Geddes | College of Human Ecology

Water Systems

COMMENTS

  1. Computer Science

    Covers applications of computer science to the mathematical modeling of complex systems in the fields of science, engineering, and finance. Papers here are interdisciplinary and applications-oriented, focusing on techniques and tools that enable challenging computational simulations to be performed, for which the use of supercomputers or ...

  2. Computer Science Research Topics (+ Free Webinar)

    Finding and choosing a strong research topic is the critical first step when it comes to crafting a high-quality dissertation, thesis or research project. If you've landed on this post, chances are you're looking for a computer science-related research topic, but aren't sure where to start.Here, we'll explore a variety of CompSci & IT-related research ideas and topic thought-starters ...

  3. 66249 PDFs

    Explore the latest full-text research PDFs, articles, conference papers, preprints and more on COMPUTER ENGINEERING. Find methods information, sources, references or conduct a literature review on ...

  4. Computer Science and Engineering Theses and Dissertations

    Human-centered Cybersecurity Research — Anthropological Findings from Two Longitudinal Studies, Anwesh Tuladhar. PDF. Learning State-Dependent Sensor Measurement Models To Improve Robot Localization Accuracy, Troi André Williams. PDF. Human-centric Cybersecurity Research: From Trapping the Bad Guys to Helping the Good Ones, Armin Ziaie Tabari

  5. Computer science

    Computer science is the study and development of the protocols required for automated processing and manipulation of data. This includes, for example, creating algorithms for efficiently searching ...

  6. Computer Science and Engineering, Department of

    Deja Vu: semantics-aware recording and replay of high-speed eye tracking and interaction data to support cognitive studies of software engineering tasks—methodology and analyses, Vlas Zyrianov, Cole S. Peterson, Drew T. Guarnera, Joshua Behler, Praxis Weston, Bonita Sharif Ph.D., and Jonathan I. Maletic. 2021 PDF

  7. Discover Publications on Computing Research

    Publications by Topic. All of Computing. Computer. IEEE Open Journal of the Computer Society. IEEE Transactions on Computers. Artificial Intelligence. IEEE Intelligent Systems. IEEE Transactions on Pattern Analysis and Machine Intelligence. Biotechnology.

  8. Computer Science & Engineering

    The primary focus of the "Computer Science & Engineering" section is the field of advanced computer science and engineering. It presents high-quality papers that address state-of-the-art technology, including Deep Tech, Edge Computing, Fog Computing, Artificial Intelligence, Machine Learning, Deep Learning, Emotional Systems, Fintech ...

  9. Open research in computer science

    Open research in computer science. Spanning networks and communications to security and cryptology to big data, complexity, and analytics, SpringerOpen and BMC publish one of the leading open access portfolios in computer science. Learn about our journals and the research we publish here on this page.

  10. Review of Computer Engineering Studies

    Review of Computer Engineering Studies (RCES) is an international, scholarly and peer-reviewed journal dedicated to providing scientists, engineers and technicians with the latest developments on computer science. The journal offers a window into the most recent discoveries in four categories, namely, computing (computing theory, scientific computing, cloud computing, high-performance computing ...

  11. Frontiers in Computer Science

    Artificial Intelligence: The New Frontier in Digital Humanities. Emanuele Frontoni. Marina Paolanti. Lucia Migliorelli. Rocco Pietrini. Stavros Asimakopoulos. An innovative journal that fosters interdisciplinary research within computational sciences and explores the application of computer science in other research domains.

  12. Computer Science and Engineering Theses, Projects, and Dissertations

    Learn Programming in Virtual Reality? A Project for Computer Science Students, Benjamin Alexander. PDF. Lung Cancer Type Classification, Mohit Ramajibhai Ankoliya. PDF. High-Risk Prediction for COVID-19 Patients Using Machine Learning, Raja Kajuluri. PDF. Improving India's Traffic Management Using Intelligent Transportation Systems, Umesh ...

  13. Undergraduate Research Topics

    Available for single-semester IW and senior thesis advising, 2024-2025. Research Areas: computational complexity, algorithms, applied probability, computability over the real numbers, game theory and mechanism design, information theory. Independent Research Topics: Topics in computational and communication complexity.

  14. Research Guides: Computer Engineering: Articles & Journals

    Selected full text articles for English-language journals, reports, conference papers, etc., in engineering, acoustics, chemistry, computers, metallurgy, physics, plastics, telecommunications, transportation, waste management, and more. Web of Science. Identifies articles and references from journals in science, the social sciences, and art and ...

  15. Research Areas

    Research areas represent the major research activities in the Department of Computer Science. Faculty and students have developed new ideas to achieve results in all aspects of the nine areas of research. ... Computer Engineering (in collaboration with the Electrical and Computer Engineering Department) More in this section. Systems and ...

  16. Strategic Research Areas

    Strategic Research Areas. Research in Electrical and Computer Engineering covers an extremely broad range of topics. Whether in computer architecture, energy and power systems or in nanotechnology devices, the research conducted in ECE is at the cutting edge of technological and scientific developments. Research. Strategic Research Areas.

  17. Computer Technology Research Paper Topics

    A list of 33 potential topics for research papers and an overview article on the history of computer technology. Topics include analog computers, artificial intelligence, computer and video games, computer displays, computer memory, and more. Learn about the development, applications, and challenges of computer technology from different perspectives.

  18. Find Articles & Papers

    Electrical and Computer Engineering Research Resources: Find Articles & Papers ... journals and magazines, standards and by topic. Also provides links to IEEE standards, IEEE spectrum and other sites. INSPEC. ... Contains full-text papers on optics and photonics from SPIE journals and proceedings published since 1990. Approximately 15,000 new ...

  19. Computer Engineering Project Topics, Ideas and Research Papers

    List of Interesting Electrical & Computer Engineering Research Project Topics. SATISFIABILITY REASONING OVER VAGUE ONTOLOGIES USING FUZZY SOFT SET THEORY. DEVELOPMENT OF AN IMPROVED PLAYFAIR CRYPTOSYSTEM USING RHOTRIX. Design, Construction, Simulation, and Performance Evaluation of a Solar Box Cooker. BILL OF ENGINEERING MEASUREMENT AND ...

  20. Computer Network Project Topics With Abstracts and Base Papers 2024

    Explore the latest M.Tech project topics in Computer Networks for 2024, featuring trending IEEE base papers. Elevate your research with cutting-edge projects covering diverse aspects of networking, from advanced algorithms to emerging technologies. Discover innovative titles, abstracts, and base papers to stay ahead in the dynamic field of Computer Networks.

  21. Research Topics

    Research in Systems Engineering at Cornell covers an extremely broad range of topics. Because of this, the research takes on a collaborative approach with faculty from many different disciplines, both in traditional engineering areas as well as those outside of engineering.