This is a two-year professional degree program designed to prepare students in the mathematical sciences for a career in contemporary industry or business. The students receive thorough training in applied mathematics and scientific computing, exposure to mathematics-related subjects in science and engineering, and experience in a group project. The program's graduates have been successful in securing desirable positions with companies ranging from small, local firms to large, international corporations.

Director: Qian-Yong Chen

**On this page:**

Program goals and structure

Student experiences and employment

Group projects

Applications and admissions

Possible interdisciplinary courses

Center for Applied Mathematics and Computation

## Program goals and structure

The Master's Degree Program in Applied Mathematics is specially designed to prepare graduates for a successful career in today's industrial/business world. Accordingly, the program is structured into the following three components:

- a core of graduate courses in applied subjects within the Department of Mathematics and Statistics;
- a selection of advanced courses in other departments (see the list of possible interdisciplinary courses below);
- a group project in which an applied scientific problem is undertaken in a collaborative effort.

The graduate courses in the Department concentrate on Analytical Methods, Numerical Methods, and Probability/Statistics. These two-semester course sequences give the student a thorough background in advanced applied mathematics.

The elective courses outside the Department are determined depending on each student's interests and preparation. In recent years, they have been chosen from Computer Science, Engineering (Industrial, Mechanical, Electrical), Physics, and Management Science. These courses expose the student to the use of practical mathematical tools by scientists and engineers.

The group project is the most novel component of this program. It is intended to emulate industrial teamwork on a large, technical problem. Through the combined efforts and diverse talents of the group members, a mathematical model is developed, a computer code is implemented, and a final report is written. In the process, the students learn how to start solving a new and hard problem, how to make a professional presentation of their work, and how to collaborate effectively with their coworkers.

## Student experiences and employment

**How is student life in the program?** Camaraderie develops naturally among the students through their common coursework and the group project class, which meets weekly as a seminar and requires joint work outside class. There are around ten students in the program in any given year. This size allows the students and the faculty to interact easily and frequently. Also, the second-year students often share their experiences and contacts with the first-year students.

**Where do the graduates go?** While a few find the program useful for developing their mathematics prior to pursuing other advanced degrees, most graduates find jobs in industry. Typically, these jobs are in high-technology firms, often falling under the label of software development. Some recent graduates are employed by large, well-known companies: DEC, GTE, Hewlett-Packard, MIT Lincoln Labs, Pfizer. Others work for smaller, local firms, such as Artios (Ludlow, MA) and Amherst Process Instruments (Hadley, MA).

**How do the graduates fare on the job market?** It appears that many employers prefer to hire candidates having strong mathematical training together with good programming skills, rather than those with other, more specialized, degrees. And, indeed, all the recent graduates from the program have secured good jobs upon completion of the program. Many have received several attractive offers.

**What do the recent graduates have to say?** Feedback from our graduates underscores the value of the program, and the group project in particular, as preparation for the workplace. Here are a few examples of their comments:

A 1993 grad now at Lincoln Labs writes, "One thing I would like to mention is that even though I may be using more computer skills than math, I feel that my math background has helped a lot. When I was hired, my boss told me that she preferred someone with a math background who could program over a computer science major. I think that was the key for me. Also, the applied math group project is a great idea because it teaches you to work as a group and prepares you for the 'real world'."

One of our 1994 alumni, who worked for a while at Fuji Capital Markets Corporation before returning to school, recalls, "Well, I remember some words from my boss at FCMC. He told me that, 'A good background in mathematics and programming makes you a very valuable person for an investment bank or a capital markets firm like Fuji Capital Markets. Usually people are either one or the other. If you combine both and have good conversational skills, that's exactly what employers are looking for.' And that is exactly what we tried to learn in the Master's Program: get a thorough math background, learn to program and communicate."

An alumna from 1995, now working at GTE, states, "I think the best selling point is the project. Not so much in terms of what the project is about, but rather the fact that you're working in a group where you have to deal with people not getting their portion of the project done, and project management issues. The team aspect is emphasized a lot at GTE and also I think in other companies. The C coding is also very important; even little things like the RCS configuration management tool are good things to talk about (We're using one now called ClearCase, and tho' I didn't know exactly how to use it, at least I knew the principles behind why we needed such a thing.)."

A 1994 graduate reports, "I am currently a Statistician/Quality Engineer for the Hewlett-Packard Company. The Applied Math Program at UMASS was great for me. I found it very flexible. I was able to choose many classes that built on my previous engineering degree. The faculty at UMASS are also very supportive and are available for the students. I never had a problem trying to get advice on any matter. In addition, the faculty realizes that students will be looking for a job after the program and they give the students many opportunities to make contacts and explore various professional paths."

A 1990 alumna who later received her PhD in meteorology writes, "The best thing about the applied math program is the solid theoretical background it gave me and the 'hands-on' application of that knowledge in a project. I also enjoyed enormously the course I took outside the department in fluid dynamics, which set the foundation of my current research work."

## Group projects

Each year a group project is completed by all of the current students. In a sense, this project class is the organizing experience for the students in the degree program. In addition, it serves as the thesis component of the M.S. degree. The second-year students are expected to take a leadership role in the project, along with the two faculty members who guide it. The first-year students gradually acquire the skills (as modelers, coders and communicators) that they will use during the next year, when they lead the project. The class meets as a weekly seminar throughout the academic year, although most of the real work occurs outside the classroom.

The projects from recent years are described briefly below. More details on projects prior to 2019 can be found in our Department Newsletters.

### 2018-19

**"Hearing" the Graph in a Directed Graph** The first group recreates a spectral graph clustering method based on relaxation of the discretized wave equation on undirected graphs and attempts to generalize the result to the directed case. They find sufficient conditions for convergent relaxation and bound the time complexity of their algorithm based on the directed graph Laplacian. Using a conductance metric, they illustrate successful cases of directed graph clustering and investigate heuristic conditions for various graph properties under which clustering may be expected to succeed.

**Cancer Growth Modeling** The second group investigates the dynamics of cancer growth and treatment using a combination of statistical, analytical, and numerical approaches. Using tumor volume data collected from laboratory mice, they obtain parameter estimates for several different cancer growth models. Using these estimates, they then analyze the performance of different treatment regimens on slightly more complicated models, and obtain numerical and theoretical results for different treatment approaches.

**Smart Grid** The third group studies the Smart Grid, an intelligent utility network needed to satisfy the ever-increasing demand on our electric grid. At a microscale level, they use gradient boosting models to predict electrical power consumption and solar power generation for a single home. They also develop a dashboard as a visualization tool for this analysis. At a macroscale level, they explore the cost optimization of power dispatch between various energy sources in a simplified model of the German electric grid. Through these levels of study, they aim to establish a deeper understanding of the potential requirements of a modern Smart Grid.

**2017-18: Electromyography Classification Using Recurrent Systems, and Multiple Scale Modeling for Predictive Material Deformation Analysis**
This year the students were divided into two groups. The first group investigated the classification of electromyography (EMG) data. They compared classical machine learning techniques, including support vector machines, classification trees, neural networks (e.g., RNNs), and the reservoir computing framework. They found that each method has its pros and cons: there is no single "best" approach for balancing interpretability, accuracy, and generalization to new data sets. The second group studied multi-scale models for material deformation while taking into consideration the variations in material properties at the micro scale. In particular, representative volume elements (RVEs) were employed to average over stresses and related quantities.

**2016-17: Math Systems for Diagnosis and Treatment of Breast Cancer**
This year’s Applied Math Master’s Project investigated the detection, classification, and growth of breast cancer tumors. The goal was to create a machine learning pipeline for detection and diagnosis from mammogram images. The group also modeled the growth and treatment of tumors using a system of ODEs. Due to a lack of human data, this last part of the pipeline uses data from experiments on lab mice and is not restricted to breast cancer.

**2015-16: Non-transitive Systems in Grasslands**

This year’s Applied Math Master's Project modeled the non-transitive interactions between three plant species in a prairie and explored the effects that urban development would have on species survival and co-existence. We began by studying non-transitive systems. In a transitive system, if A defeats B and B defeats C then A defeats C; a non-transitive system is one that does not follow these rules. The prototypical example is the game of Rock-Paper-Scissors (RPS), where Paper covers Rock, Rock crushes Scissors, and Scissors cuts Paper. Mathematically, non-transitive systems (often called RPS games) have been examined using discrete and continuous techniques and applied to model multi-player interactions in fields such as biology, economics, and social networking.

**2014-15: Uncertainty Quantification: The Story Begins**

This year’s Applied Math Master’s project used the emerging field of uncertainty quantification to address a topic of concern to human health: developing a Susceptible-Infected-Removed (SIR) model for forecasting the spread of dengue hemorrhagic fever (DHF).
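As a sketch of the modeling ingredient, the classical SIR equations can be integrated in a few lines. The rates below (transmission β, recovery γ) are illustrative placeholders, not the values the students fitted for DHF:

```python
def sir_step(s, i, r, beta, gamma, dt):
    """One forward-Euler step of the SIR equations (population fractions)."""
    ds = -beta * s * i
    di = beta * s * i - gamma * i
    dr = gamma * i
    return s + dt * ds, i + dt * di, r + dt * dr

def simulate_sir(s0=0.99, i0=0.01, beta=0.5, gamma=0.1, dt=0.1, days=160):
    """Integrate SIR and track the epidemic peak. Parameters are
    illustrative, not fitted to dengue data."""
    s, i, r = s0, i0, 0.0
    peak = i
    for _ in range(int(days / dt)):
        s, i, r = sir_step(s, i, r, beta, gamma, dt)
        peak = max(peak, i)
    return s, i, r, peak
```

Uncertainty quantification then asks how the forecast changes as β and γ are drawn from distributions rather than fixed at point estimates.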

**2013-14: From Atomic Physics and Materials Science to Financial Mathematics and Beyond**

This year the Applied Mathematics Master’s Program tackled a variety of themes in the context of its yearly project: Bose-Einstein condensates in atomic physics; granular crystals in materials science; and deterministic (and potentially chaotic) models of supply-demand-pricing and inflation in financial markets.

**2012-13: Multi-Agent Models**

This year the Master’s students in the Applied Mathematics program undertook a multi-faceted project related to the broad theme of Multi-Agent Models. This theme was chosen for two reasons. First, it offered an opportunity to model the collective behavior of complex systems arising in a range of different disciplines. In particular, three subgroups of the students studied dynamical systems from physical chemistry, biology, and finance. Second, this theme demanded a computationally intensive approach, and the students themselves expressed a desire to use the project to push the limits of their abilities as computational modelers.

**2011-12: Power Grids and Energy Transmission**

Energy has become an important issue across the whole spectrum of our society. Electricity production, one of the most important forms of energy, is often used as an indicator of whether a country is industrialized. The power grids used to transfer electricity from generators to consumers have a tremendous scale and are becoming ever more complicated as more power plants are built to meet the ever-increasing demand. This year the students in the Applied Math Master’s Degree Program worked on three projects dealing with different aspects of power grids and energy transmission.

**2010-11: Numerical Optimization of Airport Traffic**

For this year’s group project in applied math, the students modeled the efficiency of airport taxiway operations, with the aim of improving the scheduling of departing and arriving flights at a busy airport. This problem was suggested by a graduate of our department, Richard Jordan (Ph.D., 1994), who currently works at MIT Lincoln Laboratory. Rich’s group at the Laboratory is under contract with the FAA to update various aspects of airport operations by means of modern automation.

**2009-10: Microscopic Traffic Flow Modeling, and Compressive Sampling**

In the past year, the Applied Mathematics Master’s students were divided into two groups and worked on two separate projects. The first group worked on a project about “Compressive Sampling,” a state-of-the-art technique for compressing data during acquisition. The basic idea goes back to the 1970s, when seismologists first used reflected waves to construct an image of the Earth’s interior structure. But the field exploded around 2004, after David Donoho, Emmanuel Candès, Justin Romberg, and Terence Tao discovered that the minimum amount of data needed to reconstruct an image can be less than that required by the famous Nyquist-Shannon criterion.

The second group worked on a project on Microscopic Traffic Flow Modeling. Unlike macroscopic models, which treat traffic flow as an effectively one-dimensional compressible fluid, microscopic traffic models are built up from the level of individual cars and the interactions between them. The car-following model is one such model, based on a stimulus-response mechanism: the following car takes actions such as acceleration or deceleration whenever there is a stimulus from the leading car, such as a change of relative speed or headway. Ideally, models of this kind should reproduce common traffic phenomena, such as stop-and-go waves, platoon diffusion, or spontaneous congestion. In practical situations they could be used to predict traffic conditions on major roads and to aid traffic control procedures.
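A minimal member of this family can be written down directly. In the sketch below the follower's acceleration is proportional to the relative speed of the leader; the sensitivity and initial conditions are illustrative, not taken from the project:

```python
def follow_the_leader(v_lead=30.0, v_follow=20.0, gap=50.0,
                      sensitivity=0.5, dt=0.1, steps=200):
    """Simplest stimulus-response car-following model: the follower
    accelerates in proportion to the relative speed of the car ahead.
    Returns the (speed, gap) history of the following car."""
    history = []
    for _ in range(steps):
        accel = sensitivity * (v_lead - v_follow)   # stimulus: relative speed
        v_follow += accel * dt
        gap += (v_lead - v_follow) * dt             # headway to the leader
        history.append((v_follow, gap))
    return history
```

With these parameters the follower's speed relaxes exponentially toward the leader's; richer variants make the sensitivity depend on headway, which is what produces stop-and-go instabilities.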

**2008-09: Modeling Climate Change**

At first sight, there is no easy entry point for mathematical modelers into the extremely complex subject of climate dynamics. State-of-the-art climate predictions are based on elaborate numerical models that attempt to include all relevant physical processes in the entire Earth system. These numerical simulators, which grew out of weather-prediction technology, are generically called GCMs, meaning General Circulation Models, although nowadays perhaps Global Climate Models is a more appropriate term. Their governing equations incorporate the circulation of the atmosphere as well as its radiative physics and chemistry (carbon dioxide, ozone, aerosols), the circulations of the oceans and coupling through the hydrosphere (water vapor, clouds, glaciers, sea ice), and even aspects of the biosphere (forests, soils, marine biota). Models with this level of complexity take decades to develop, test, and tune, and they are very expensive to run. Moreover, the results and predictions that they produce are often quite hard to interpret, especially if the goal is to identify a particular mechanism and its effects.

**2007-08: Cancerous Tumor, and Data Compression**

This year, students in the program worked on two projects. In the first project they looked at models of blood vessel growth towards a cancerous tumor. A critical question for a patient diagnosed with cancer is whether the disease is local or has spread to other locations. Cancer cells penetrate into lymphatic and blood vessels, circulate through the bloodstream, and then invade and grow in normal tissues elsewhere. This mechanism of spreading is called metastasis. Its ability to spread to other tissues and organs makes cancer a life-threatening disease. Hence, there is naturally a great interest in understanding what makes metastasis possible for a malignant tumor. One of the key findings of cancer researchers studying the conditions necessary for metastasis is the fact that the growth of new blood vessels is critical in this respect.

In the second group project, the applied math students studied data compression. In computer science and information theory, data compression is the process of encoding information using fewer bits than the uncoded representation would use. A popular instance of compression is the ZIP file format. As with any communication, compressed data communication is useful only when the sender and receiver understand the coding scheme. Compression is useful because it reduces the amount of space required to store the original data. On the other hand, compressed data must be decompressed in order to be used, and the additional processing can be costly for some applications. For example, a compressed video may require expensive hardware to be decompressed fast enough to be viewed while it is being decompressed.
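A toy encode/decode round trip makes the idea concrete. Run-length encoding, sketched below, is far simpler than the DEFLATE algorithm inside ZIP, but it shows the same principle: exploit redundancy to use fewer symbols.

```python
def rle_encode(data: str) -> list:
    """Run-length encoding: collapse runs of identical characters
    into (character, count) pairs."""
    if not data:
        return []
    runs = []
    current, count = data[0], 1
    for ch in data[1:]:
        if ch == current:
            count += 1
        else:
            runs.append((current, count))
            current, count = ch, 1
    runs.append((current, count))
    return runs

def rle_decode(runs: list) -> str:
    """Invert the encoding: expand each (character, count) pair."""
    return "".join(ch * count for ch, count in runs)
```

As with any lossless scheme, decoding recovers the input exactly; the saving depends entirely on how repetitive the data is.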

**2006-07: The Mathematics of Climate**

The group worked on mathematical models of global climate. The first person in history to publish a scientific paper on the physical principles that underlie climate — namely, the overall effect of solar radiation and its interaction with the Earth’s surface and atmosphere — was the famous mathematician Joseph Fourier. In the 1820s the father of the heat equation asked himself how it is that the Earth maintains an equilibrium temperature and what that temperature should be. He first wondered why the Earth is not much hotter than it is, given that it is continually heated by the Sun. He realized that the Earth balances the solar radiation it receives by emitting lower frequency (infrared) radiation back into space. But then his calculations suggested that the equilibrium temperature should be below freezing worldwide. The discrepancy lay in the fact that some gases in the atmosphere absorb the outgoing infrared radiation even though they are almost transparent to the incoming solar radiation. Of course, these are the greenhouse gases, principally water vapor and carbon dioxide. Although science was much too primitive in Fourier’s time for him to make a thorough analysis, his simple picture of the key processes has stood the test of time, and today it underlies the urgent debate on global warming.
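Fourier's balance argument reduces to a single formula: at equilibrium, absorbed solar radiation equals emitted thermal radiation. The sketch below uses modern values for the solar constant and albedo (not numbers Fourier had) and reproduces his puzzle:

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0        # solar constant at Earth's orbit, W m^-2
ALBEDO = 0.30     # fraction of sunlight reflected back to space

def equilibrium_temperature(solar=S, albedo=ALBEDO):
    """Equilibrium temperature of an airless planet: set absorbed
    sunlight S(1-a)/4 (averaged over the sphere) equal to the
    Stefan-Boltzmann emission sigma*T^4 and solve for T."""
    absorbed = solar * (1.0 - albedo) / 4.0
    return (absorbed / SIGMA) ** 0.25
```

Without an atmosphere this gives roughly 255 K, well below freezing worldwide — exactly the discrepancy Fourier found. The roughly 33 K gap to the observed mean surface temperature is the greenhouse effect.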

**2005-06: The Google Search Engine and the Mechanics of Human Locomotion**

The group worked on two projects. The first project was to model the search engine Google. After people type keywords in Google, it prepares a list of websites associated with those keywords. By applying the power method in numerical analysis, the students were able to simulate the page-rank processing of a network. They wrote a program, called a webcrawler, that crawls the internet site by site to determine how sites are interconnected. This created a network of 60,000 sites containing the Department of Mathematics and Statistics and its connected sites. The students then applied the page-rank algorithm to this network. The students also applied the same algorithm to other topics, such as developing a ranking system of US airports that would help determine which airports are the most important according to the number of passengers. Once again using the power method and applying it to an actual data set obtained from the Bureau of Transportation, the students concluded that Dallas/Fort Worth International Airport is the most important airport in the US.
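The power-method computation behind page rank can be illustrated on a toy network. The sketch below is a generic damped PageRank iteration, not the students' crawler or code:

```python
def pagerank(links, damping=0.85, tol=1e-10):
    """Power-method PageRank. links[i] lists the pages that page i
    links to. Iterates the damped rank-propagation map until the
    rank vector stops changing."""
    n = len(links)
    rank = [1.0 / n] * n
    while True:
        new = [(1.0 - damping) / n] * n
        for i, outs in enumerate(links):
            if outs:                      # distribute rank along out-links
                share = damping * rank[i] / len(outs)
                for j in outs:
                    new[j] += share
            else:                         # dangling page: spread evenly
                for j in range(n):
                    new[j] += damping * rank[i] / n
        if sum(abs(a - b) for a, b in zip(new, rank)) < tol:
            return new
        rank = new

# Toy network: page 0 is linked to by both other pages,
# so it ends up with the highest rank.
ranks = pagerank([[1], [0, 2], [0]])
```

The same iteration ranks airports once `links` encodes passenger flows instead of hyperlinks, which is essentially what the students did with the transportation data.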

The second project focused on running. Undertaking this project involved input from many academic areas. The students first had to understand the physiology of the leg as well as the mechanics of how the leg moves and interacts with various forces during running. As they learned, the running process can be broken down into two phases. The stance phase is the period of time when the foot is still in contact with the ground, and the flight phase is the period of time when the foot and the body are in the air. Each phase is governed by a different set of equations derived from Newton's laws of motion. The stance phase is described by three second-order differential equations while the flight phase is described by the equations for common projectile motion found in physics. The students solved the coupled differential equations for the two phases numerically in order to simulate running.

**2004-05: Pattern Formation, Tumor Growth and Turing Instability**

The group modeled the spots and stripes that occur on plants and animals. The model is based on the Turing instability, also called diffusion-driven instability. A coupled pair of partial differential equations is used to model the pattern formation. These equations are studied analytically to determine when instabilities will occur, and are then solved numerically with a finite difference method in different domains with varying parameters, producing spots, stripes, and combinations of the two.
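For reference, the Turing conditions for a generic two-species reaction-diffusion system can be stated compactly. The notation below is the standard textbook form, with partial derivatives of the kinetics evaluated at the homogeneous steady state, not taken from the group's report:

```latex
\begin{aligned}
&u_t = f(u,v) + D_u \nabla^2 u, \qquad v_t = g(u,v) + D_v \nabla^2 v,\\
&\text{stable without diffusion:}\quad f_u + g_v < 0, \qquad f_u g_v - f_v g_u > 0,\\
&\text{unstable with diffusion:}\quad D_v f_u + D_u g_v > 2\sqrt{D_u D_v\,(f_u g_v - f_v g_u)}.
\end{aligned}
```

Patterns form when the two diffusivities differ enough that the last inequality holds while the first two keep the well-mixed state stable.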

A simple model of tumor growth is also proposed. The model is based on the tumor releasing a chemical (tumor angiogenesis factor, TAF) and using it to recruit blood vessels to proliferate in its direction and eventually vascularize the tumor. The model involved a partial differential equation for the chemical and one for the blood vessels. The equations were solved numerically with a finite difference method, and the solutions were similar to what is observed experimentally.

**2003-04: Traffic Flow with Cellular Automata and Kinetic Models**

The group looked at different models for simulating traffic flow. The primary model was based on cellular automata: a finite set of vehicles with a finite set of rules governing their interaction. The model gave very realistic results; the group showed that one can predict the mean velocity of a collection of vehicles as a function of density. One- and two-lane roads were modeled, along with stop lights and ramps.
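One widely used cellular automaton of this type is the Nagel-Schreckenberg model. The sketch below implements its standard rules (accelerate, avoid collision, random slowdown, move) on a circular road; the group's exact rule set is not recorded in this summary:

```python
import random

def nasch_step(road, v, n_cells=100, vmax=5, p_slow=0.3):
    """One parallel update of the Nagel-Schreckenberg cellular automaton
    on a circular road. road holds occupied cell indices, v the matching
    integer speeds."""
    order = sorted(range(len(road)), key=lambda k: road[k])
    pos = [road[k] for k in order]
    vel = [v[k] for k in order]
    n = len(pos)
    for k in range(n):
        gap = (pos[(k + 1) % n] - pos[k] - 1) % n_cells  # empty cells ahead
        vel[k] = min(vel[k] + 1, vmax, gap)              # accelerate, no crash
        if vel[k] > 0 and random.random() < p_slow:      # random slowdown
            vel[k] -= 1
    pos = [(x + w) % n_cells for x, w in zip(pos, vel)]
    return pos, vel
```

Averaging the speeds over many steps at different car densities traces out the density-velocity relation mentioned above, including the sharp drop past a critical density.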

The group also derived a kinetic model, intermediate between the microscopic cellular automata and the macroscopic partial differential equation. The kinetic model produced solutions similar to those of the cellular automata, which the macroscopic differential equation cannot reproduce.

**2002-03: Modeling of the Kidney and Lungs**

The regulation of sodium chloride in the kidney is modeled. Each kidney contains over one million nephrons, the basic functional units of the kidney, and each nephron regulates, among other things, the concentration of sodium chloride. The transport of sodium chloride in the loop of Henle, which is part of the nephron, is modeled with a partial differential equation. The equation is analyzed to determine when stable and unstable solutions might occur, and is solved numerically using the Lax-Wendroff method. The computed solutions exhibit oscillations in the sodium concentration in time, as predicted by the analysis; such oscillations are observed in rats and humans.
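The Lax-Wendroff scheme itself fits in a few lines. The sketch below applies it to the simplest test case, the linear advection equation u_t + a u_x = 0 on a periodic grid, rather than to the nephron model:

```python
def lax_wendroff(u, a=1.0, dx=0.01, dt=0.005, steps=100):
    """Advance u_t + a*u_x = 0 with the Lax-Wendroff scheme on a
    periodic grid. Second-order accurate in space and time."""
    c = a * dt / dx          # Courant number; stability requires |c| <= 1
    n = len(u)
    for _ in range(steps):
        un = u[:]
        for i in range(n):
            um, up = un[i - 1], un[(i + 1) % n]
            u[i] = (un[i] - 0.5 * c * (up - um)
                    + 0.5 * c * c * (up - 2.0 * un[i] + um))
    return u
```

The scheme transports a profile at speed a while conserving its integral exactly, at the cost of small dispersive oscillations near sharp fronts — the trade-off the group had to weigh against the oscillations predicted by the analysis.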

The group also worked with a research pulmonologist at Baystate Medical Center, looking at the amount of carbon dioxide exhaled over time by healthy patients and patients with asthma. The group tested different methods for removing the noise from the data (smoothing) and proposed several good methods that the pulmonologist could use in his work.

**2001-02: Artificial Neural Networks**

**2000-01: Dynamic control of a multilink mechanical system**

**1999-00: Modeling and visualizing human movement via mechanics and optimal control**

**1998-99: Quasi-geostrophic turbulence modelling using pseudospectral methods**

The objective of this project is to develop a mathematical model for forecasting atmospheric pressure patterns. The common assumption is made that the atmosphere can be modelled as an incompressible fluid. The laws governing atmospheric pressure changes are then described using the Navier-Stokes equations in a rotating coordinate frame. This nonlinear system of partial differential equations is solved numerically by means of a pseudospectral method.
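The core pseudospectral idea — differentiate by multiplying Fourier coefficients by ik — can be shown on a toy periodic function. The sketch below uses a direct O(n²) discrete Fourier transform to stay dependency-free; a production code like the project's would use the FFT:

```python
import cmath, math

def spectral_derivative(u):
    """Differentiate periodic samples u on [0, 2*pi) pseudospectrally:
    transform to Fourier space, multiply mode k by i*k, transform back."""
    n = len(u)
    # forward DFT: c[k] = (1/n) * sum_j u[j] * exp(-2*pi*i*j*k/n)
    c = [sum(u[j] * cmath.exp(-2j * math.pi * j * k / n) for j in range(n)) / n
         for k in range(n)]
    # multiply by i*k using signed wavenumbers k in (-n/2, n/2]
    for k in range(n):
        kk = k if k <= n // 2 else k - n
        c[k] *= 1j * kk
    # inverse DFT back to grid values (imaginary part is roundoff)
    return [sum(c[k] * cmath.exp(2j * math.pi * j * k / n) for k in range(n)).real
            for j in range(n)]
```

For smooth periodic data the error decays faster than any power of the grid spacing, which is why pseudospectral methods are the tool of choice for turbulence simulations like this one.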

**1997-98: Macroscopic modelling of traffic flow**

Traffic flow is modelled through a hydrodynamic analogy, and the resulting nonlinear hyperbolic partial differential equation is solved numerically. In addition, on-ramps, off-ramps, and bottlenecks are modelled, and these complexities are also implemented in the computer simulation program. Then, in order to model a two-lane highway, the concept of lane changing is examined. A model of lane changing from the literature is discussed, and it is argued that this formulation is incorrect. A modified lane changing model is presented, and its validity is supported by the results of several simulations, again performed through the numerical solution of the governing differential equation. Finally, in order to illustrate the interrelationship between the effects of ramps and bottlenecks and the process of lane changing, results are presented for simulations which model ramps and bottlenecks along a two-lane highway.

**1996-97: Monte-Carlo simulation of turbulent atmospheric diffusion**

The physical and mathematical diffusion of particles through a turbulent velocity field was calculated via two methods, a Random Eddy Model and Fourier Spectrum Model. The Random Eddy algorithm simulates a lattice of Rankine vortices; the Fourier Spectrum code utilizes a sum of sine and cosine terms to approximate the stream function of the turbulent velocity field. Spatial correlation experiments were performed to ensure appropriate behavior for the moving particles, as well as parameter choices. Simulations of particle emanation from a smoke stack were also performed.

**1995-96: Acoustic radiation and propagation**

The sound field of a planar generator of general shape and/or mode was calculated by using a surface integral representation of the solution to the governing Helmholtz equation. Comparisons were made with some classical formulas available either in simple, symmetric cases or in asymptotic regimes. Interesting interference patterns in the sound intensity near the radiator were detailed over a range of frequencies and generator characteristics. The directivity of these sound generators was also studied.

**1994-95: Models of convective turbulent diffusion**

The steady-state concentration field of a pollutant introduced into a flowing, turbulent atmosphere was analyzed. A finite-difference method (alternating direction implicit) was implemented to solve the variable-coefficient diffusion equation in three dimensions, under a parabolic approximation in which the downstream variable is time-like. The plume formed by a source was computed and displayed graphically for various sheared wind-flow conditions.

**1993-94: Optics analysis**

The design of a lens system was tackled using a direct numerical approach based on ray-tracing for the geometrical optics. Optical properties (focusing, magnification) of various instruments (simple telescopes, microscopes) were examined by computing the three-dimensional pencils of rays, without the classical paraxial approximation. Then the aberrations (spherical, coma, astigmatism, ...) were quantified numerically, and an optimization code was used to vary the lens system parameters so as to minimize a given aberration.

**1992-93: Spectral computations in fluid dynamics**

The behavior of a two-dimensional viscous fluid was simulated by a direct numerical computation using a pseudospectral method. First, some simpler one-dimensional codes were written for the Burgers and Korteweg-DeVries equations, and some wave interaction phenomena governed by these equations were studied. Then, the full code for a Navier-Stokes flow in two dimensions was implemented, and various vortex interactions were displayed.

## Applications and admissions

Those wishing to be considered for Fall admission should submit all application materials to the Graduate Admissions Office during the preceding Spring. Applications are reviewed beginning on February 1, with precedence given to those received before that date. Later applications are considered provided that openings are available. Applicants are encouraged to visit in person, if possible, to meet the faculty and students in the program.

All applicants are expected to have a strong undergraduate preparation in mathematics, including advanced calculus, linear algebra, and differential equations. Some exposure to computer science and/or scientific computing is also desirable, as is some knowledge of another area of science or engineering. A Bachelor's Degree in Mathematics, however, is not necessary. Students with undergraduate majors in Physics or Engineering, for instance, and with sufficient mathematical background, are encouraged to apply.

The program is able to offer a tuition waiver and a stipend to a limited number of students upon admission. This financial support takes the form of a teaching assistantship in the department. The duties of the students in the Master's Degree Program are usually restricted to grading or consulting for an undergraduate course, although instructing in an elementary course is also possible.

For additional information, contact the Program Director Qian-Yong Chen.

## Possible interdisciplinary courses

### Computer science

**COMPSCI 513: Logic in Computer Science**

Rigorous introduction to mathematical logic from an algorithmic perspective. Topics include: Propositional logic: Horn clause satisfiability and SAT solvers; First Order Logic: soundness and completeness of resolution, compactness theorem. We will use the Coq theorem prover and Datalog. Prerequisites: COMPSCI 250 and COMPSCI 311. 3 credits.

**COMPSCI 575: Combinatorics and Graph Theory**

This course is a basic introduction to combinatorics and graph theory for advanced undergraduates in computer science, mathematics, engineering and science. Topics covered include: elements of graph theory; Euler and Hamiltonian circuits; graph coloring; matching; basic counting methods; generating functions; recurrences; inclusion-exclusion; and Polya's theory of counting. Undergraduate Prerequisites: mathematical maturity; calculus; linear algebra; strong performance in some discrete mathematics class, such as COMPSCI 250 or MATH 455. Modern Algebra - MATH 411 - is helpful but not required. 3 credits.

**COMPSCI 585: Introduction to Natural Language Processing**

Natural Language Processing (NLP) is the engineering art and science of how to teach computers to understand human language. NLP is a type of artificial intelligence technology, and it's now ubiquitous -- NLP lets us talk to our phones, use the web to answer questions, map out discussions in books and social media, and even translate between human languages. Since language is rich, subtle, ambiguous, and very difficult for computers to understand, these systems can sometimes seem like magic -- but these are engineering problems we can tackle with data, math, machine learning, and insights from linguistics. This course will introduce NLP methods and applications including probabilistic language models, machine translation, and parsing algorithms for syntax and the deeper meaning of text. During the course, students will (1) learn and derive mathematical models and algorithms for NLP; (2) become familiar with basic facts about human language that motivate them, and help practitioners know what problems are possible to solve; and (3) complete a series of hands-on projects to implement, experiment with, and improve NLP models, gaining practical skills for natural language systems engineering. Undergraduate Prerequisites: COMPSCI 220 (or COMPSCI 230) and COMPSCI 240. An alternate prerequisite of LINGUIST 492B is acceptable for Linguistics majors. 3 credits.

**COMPSCI 589: Machine Learning**

This course will introduce core machine learning models and algorithms for classification, regression, clustering, and dimensionality reduction. On the theory side, the course will focus on understanding models and the relationships between them. On the applied side, the course will focus on effectively using machine learning methods to solve real-world problems with an emphasis on model selection, regularization, design of experiments, and presentation and interpretation of results. The course will also explore the use of machine learning methods across different computing contexts. Students will complete programming assignments and exams. Python is the required programming language for the course. Prerequisites: COMPSCI 383 and MATH 235. 3 credits.
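As a taste of the "regression" topic mentioned above (this is an illustrative sketch, not course material), here is an ordinary least-squares fit of a line y ≈ a·x + b using the closed-form solution for a single feature, in plain Python:

```python
# Illustrative only: fit y = a*x + b by ordinary least squares,
# using the closed-form estimates for slope and intercept.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]   # data generated exactly by y = 2x + 1

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Slope: covariance of (x, y) divided by variance of x.
a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
    / sum((x - mean_x) ** 2 for x in xs)
# Intercept: forces the fitted line through the mean point.
b = mean_y - a * mean_x
```

On this noiseless data the fit recovers a = 2 and b = 1 exactly; in the course itself such models are built and evaluated with standard Python libraries rather than by hand.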

**COMPSCI 590D: Algorithms for Data Science**

Big Data brings us to interesting times and promises to revolutionize our society from business to government, from healthcare to academia. As we walk through this digitized age of exploded data, there is an increasing demand to develop unified toolkits for data processing and analysis. In this course our main goal is to rigorously study the mathematical foundation of big data processing, develop algorithms and learn how to analyze them. Specific Topics to be covered include: 1) Clustering 2) Estimating Statistical Properties of Data 3) Near Neighbor Search 4) Algorithms over Massive Graphs and Social Networks 5) Learning Algorithms 6) Randomized Algorithms. This course counts as a CS Elective toward the CS major (BS/BA). Undergraduate Prerequisites: COMPSCI 240 and COMPSCI 311. 3 credits.

**COMPSCI 590IV + 690IV: Intelligent Visual Computing**

The course will teach students algorithms that intelligently process, analyze and generate visual data. The course will start by covering the most commonly used image and shape descriptors. It will proceed with statistical models for representing 2D images, textures, 3D shapes and scenes. The course will then provide an in-depth background on topics of shape and image analysis and co-analysis. Particular emphasis will be given on topics of automatically inferring function from shapes, as well as their contextual relationships with other shapes in scenes and human poses. Finally, the course will cover topics on automating the design and synthesis of 3D shapes with machine learning algorithms and advanced human-computer interfaces. Students will read, present and critique state-of-the-art research papers on the above topics. This course counts as a CS Elective toward the CS major (BA/BS). 3 credits.

**COMPSCI 590N: Introduction to Numerical Computing with Python**

This course is an introduction to computer programming for numerical computing. The course is based on the computer programming language Python and is suitable for students with no programming or numerical computing background who are interested in taking courses in machine learning, natural language processing, or data science. The course will cover fundamental programming, numerical computing, and numerical linear algebra topics, along with the Python libraries that implement the corresponding data structures and algorithms. The course will include hands-on programming assignments and quizzes. No prior programming experience is required. Familiarity with undergraduate-level probability, statistics and linear algebra is assumed. 1 credit.
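To give a flavor of the numerical linear algebra covered (a minimal sketch, not course material), the following plain-Python function solves a small linear system Ax = b by Gaussian elimination with partial pivoting:

```python
# Illustrative only: solve a small dense system Ax = b by Gaussian
# elimination with partial pivoting and back-substitution.

def solve(A, b):
    n = len(A)
    # Augmented matrix [A | b], so row operations act on both at once.
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        # Partial pivoting: bring the largest remaining pivot into place.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        # Eliminate the entries below the pivot.
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    # Back-substitution on the upper-triangular system.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

x = solve([[2.0, 1.0], [1.0, 3.0]], [3.0, 5.0])  # solves 2x+y=3, x+3y=5
```

In the course, the same computation would be done with NumPy's linear algebra routines; writing it out by hand shows what those routines do internally.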

**COMPSCI 590V: Data Visualization and Exploration**

In this course, students will learn the fundamental algorithmic and design principles of visualizing and exploring complex data. The course will cover multiple aspects of data presentation including human perception and design theory; algorithms for exploring patterns in data such as topic modeling, clustering, and dimensionality reduction. A wide range of statistical graphics and information visualization techniques will be covered. We will explore numerical data, relational data, temporal data, spatial data, graphs and text. Hands-on projects will be based on Python or JavaScript with D3. This course counts as a CS Elective toward the CS major (BA/BS). Undergraduate Prerequisite: COMPSCI 220 or 230. No prior knowledge of data visualization or exploration is assumed. 3 credits.

**CICS 597C: Introduction to Computer Security**

This course provides an introduction to the principles and practice of computer and network security with a focus on both fundamentals and practical information. The key topics of this course are applied cryptography; protecting users, data, and services; network security; and common threats and defense strategies. Students will complete several practical lab assignments involving security tools (e.g., OpenSSL, Wireshark, Malware detection). The course includes homework assignments, quizzes, and exams. Prerequisites are CICS 290S or equivalent experience with instructor permission. 3 credits.

**COMPSCI 611: Advanced Algorithms**

Principles underlying the design and analysis of efficient algorithms. Topics to be covered include: divide-and-conquer algorithms, graph algorithms, matroids and greedy algorithms, randomized algorithms, NP-completeness, approximation algorithms, linear programming. Prerequisites: The mathematical maturity expected of incoming Computer Science graduate students, knowledge of algorithms at the level of COMPSCI 311. 3 credits.

**COMPSCI 617: Computational Geometry**

Geometric algorithms lie at the heart of many applications, ranging from computer graphics in games and virtual reality engines to motion planning in robotics or even protein modeling in biology. This graduate course is an introduction to the main techniques from Computational Geometry, such as convex hulls, triangulations, Voronoi diagrams, visibility, art gallery problems, and motion planning. The class will cover theoretical as well as practical aspects of the field. The goal of the class is to enable students to exploit a broad range of algorithmic tools from computational geometry to solve problems in a variety of application areas. Prerequisite: Mathematical maturity; COMPSCI 611 or COMPSCI 601. Eligibility: Graduate CS students only. Others with permission of instructor. 3 credits.

**COMPSCI 660: Advanced Information Assurance**

This course provides an in-depth examination of the fundamental principles of information assurance. While the companion course for undergraduates is focused on practical issues, the syllabus of this course is influenced strictly by the latest research. We will cover a range of topics, including authentication, integrity, confidentiality of distributed systems, network security, malware, privacy, intrusion detection, intellectual property protection, and more. Prerequisites: COMPSCI 460 or 466, or equivalent. 3 credits.

**COMPSCI 682: Neural Networks: A Modern Introduction**

This course will focus on modern, practical methods for deep learning. The course will begin with a description of simple classifiers such as perceptrons and logistic regression classifiers, and move on to standard neural networks, convolutional neural networks, and some elements of recurrent neural networks, such as long short-term memory networks (LSTMs). The emphasis will be on understanding the basics and on practical application more than on theory. Most applications will be in computer vision, but we will make an effort to cover some natural language processing (NLP) applications as well, contingent upon TA support. The current plan is to use Python and associated packages such as NumPy and TensorFlow. Prerequisites include Linear Algebra, Probability and Statistics, and Multivariate Calculus. Some assignments will be in Python and some in C++. 3 credits.

**COMPSCI 687: Reinforcement Learning**

This course will provide an introduction to, and comprehensive overview of, reinforcement learning. In general, reinforcement learning algorithms repeatedly answer the question "What should be done next?", and they can learn via trial and error to answer these questions even when there is no supervisor telling the algorithm what the correct answer would have been. Applications of reinforcement learning span across medicine (How much insulin should be injected next? What drug should be given next?), marketing (What ad should be shown next?), robotics (How much power should be given to the motor?), game playing (What move should be made next?), environmental applications (Which countermeasure for an invasive species should be deployed next?), and dialogue systems (What type of sentence should be spoken next?), among many others. Broad topics covered in this course will include: Markov decision processes, reinforcement learning algorithms (model-based / model-free, batch / online, value function based, actor-critics, policy gradient methods, etc.), hierarchical reinforcement learning, representations for reinforcement learning (including deep learning), and connections to animal learning. Special topics may include ensuring the safety of reinforcement learning algorithms, theoretical reinforcement learning, and multi-agent reinforcement learning. This course will emphasize hands-on experience, and assignments will require the implementation and application of many of the algorithms discussed in class. PREREQUISITES: COMPSCI 589, or COMPSCI 689, or COMPSCI 683, with a grade of C or better. Familiarity with an object oriented programming language is required (assignments will use C++, but familiarity with C++ specifically will not be assumed). 3 credits.

**COMPSCI 688: Probabilistic Graphical Models**

Probabilistic graphical models are an intuitive visual language for describing the structure of joint probability distributions using graphs. They enable the compact representation and manipulation of exponentially large probability distributions, which allows them to efficiently manage the uncertainty and partial observability that commonly occur in real-world problems. As a result, graphical models have become invaluable tools in a wide range of areas from computer vision and sensor networks to natural language processing and computational biology. The aim of this course is to develop the knowledge and skills necessary to effectively design, implement and apply these models to solve real problems. The course will cover (a) Bayesian and Markov networks and their dynamic and relational extensions; (b) exact and approximate inference methods; (c) estimation of both the parameters and structure of graphical models. Although the course is listed as a seminar, it will be taught as a regular lecture course with programming assignments and exams. Students entering the class should have good programming skills and knowledge of algorithms. Undergraduate-level knowledge of probability and statistics is recommended. 3 credits.

**COMPSCI 689: Machine Learning**

Machine learning is the computational study of artificial systems that can adapt to novel situations, discover patterns from data, and improve performance with practice. This course will cover the popular frameworks for learning, including supervised learning, reinforcement learning, and unsupervised learning. The course will provide a state-of-the-art overview of the field, emphasizing the core statistical foundations. Detailed course topics: overview of different learning frameworks such as supervised learning, reinforcement learning, and unsupervised learning; mathematical foundations of statistical estimation; maximum likelihood and maximum a posteriori (MAP) estimation; missing data and expectation maximization (EM); graphical models including mixture models, hidden-Markov models; logistic regression and generalized linear models; maximum entropy and undirected graphical models; nonparametric models including nearest neighbor methods and kernel-based methods; dimensionality reduction methods (PCA and LDA); computational learning theory and VC-dimension; reinforcement learning; state-of-the-art applications including bioinformatics, information retrieval, robotics, sensor networks and vision. Prerequisites: undergraduate level probability and statistics, linear algebra, calculus, AI; computer programming in some high level language. 3 credits.

**COMPSCI 690LG: Advanced Logic in Computer Science**

Rigorous introduction to mathematical logic from an algorithmic perspective. Topics include: Propositional logic: Horn clause satisfiability and SAT solvers; First Order Logic: soundness and completeness of resolution, compactness theorem, automatic theorem proving, model checking. We will learn about and use the Coq theorem prover, Datalog, a Model Checker, and SAT and SMT solvers. Prerequisites: Students taking this course should have undergraduate preparation in discrete math and algorithms. Requirements will include readings, class participation, weekly problem sets, a midterm and a final project. 3 credits.

**COMPSCI 690M: Machine Learning Theory**

When, how, and why do machine learning algorithms work? This course answers these questions by studying the theoretical aspects of machine learning, with a focus on statistically and computationally efficient learning. Broad topics will include: PAC-learning, uniform convergence, and model selection; supervised learning algorithms including SVM, boosting, kernel methods; online learning algorithms and analysis; unsupervised learning with guarantees. Special topics may include: Bandits, active learning, semi-supervised learning and others. 3 credits.

**COMPSCI 690V: Visual Analytics**

In this course, students will work on solving complex problems in data science using exploratory data visualization and analysis in combination. Students will learn to deal with the five V's: Volume, Variety, Velocity, Veracity, and Variability; that is, with large data, complex heterogeneous data, streaming data, uncertainty in data, and variations in data flow, density and complexity. Students will be able to select the appropriate tools and visualizations in support of problem solving in different application areas. The course is a practical continuation of COMPSCI 590V - Data Visualization and Exploration and focuses on complex problems and applications; however, 590V is not a prerequisite, and both 590V and 690V may be taken independently of each other. The data sets and problems will be selected mainly from the IEEE VAST Challenges, but also from the KDD CUP, Amazon, Netflix, GroupLens, MovieLens, Wiki releases, Biology competitions and others. We will solve crime, cyber security, health, social, communication, marketing and similar large-scale problems. Data sources will be quite broad and include text, social media, audio, image, video, sensor, and communication collections representing very real problems. Hands-on projects will be based on Python or R, and various visualization libraries, both open source and commercial. 3 credits.

**COMPSCI 691E: Interactive Machine Learning**

Interactive machine learning involves an algorithm or an agent making decisions about data collection, contrasting starkly with traditional learning paradigms. Interactive data collection often enables learning with significantly less data, and it is critical in a number of applications including personalized recommendation, medical diagnosis, and dialogue systems. This seminar will focus on the design and analysis of interactive learning algorithms for settings including active learning, bandits, reinforcement learning, and adaptive sensing. We will cover foundational and contemporary papers, with an emphasis on algorithmic design principles as well as understanding and proving performance guarantees. Students enrolled in the 3 credit version of the course will present one paper in detail to the class as well as prepare notes for one additional lecture. Students enrolled in the 1 credit version of the course will prepare notes for one lecture. Lect 01=3 credits; Lect 02=1 credit.

### Mechanical and industrial engineering

**MIE 586 - Quantitative Decision Making**

Survey in operations research. Introduction to models and procedures for quantitative analyses of decision problems. Topics include linear programming and extensions, integer programming. Required for IE graduate students who lack operations research exposure.

**MIE 605 - Finite Element Analysis**

The underlying mathematical theory behind the finite element method and its application to the solution of problems from solid mechanics. Includes a term project involving the application of the finite element method to a realistic and sufficiently complex engineering problem selected by the student and approved by the instructor; requires the use of a commercial finite element code.

**MIE 644 - Applied Data Analysis**

The basics of data acquisition and analysis, pattern classification, system identification, neural network modeling, and fuzzy systems. Essential to students whose thesis projects involve experimentation and data analysis.

**MIE 684 - Stochastic Processes In Industrial Engineering**

Introduction to the theory of stochastic processes with emphasis on Markov chains, Poisson processes, Markovian queues and networks, and computational techniques in Jackson networks. Applications include stochastic models of production systems, reliability and maintenance, and inventory control.
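As a small illustration of the Markov chain material (a sketch with made-up numbers, not course material), the stationary distribution of a two-state chain can be found by repeatedly applying the transition matrix to a starting distribution:

```python
# Illustrative only: stationary distribution of a two-state Markov
# chain via power iteration, pi_{t+1} = pi_t P.

P = [[0.9, 0.1],   # row i holds P(next state = j | current state = i)
     [0.5, 0.5]]

pi = [0.5, 0.5]    # any starting distribution works for this chain
for _ in range(200):
    pi = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]

# The stationary distribution satisfies pi = pi P; for this P it is
# pi = [5/6, 1/6], which the iteration converges to.
```

Solving pi = pi·P directly gives the same answer; iteration generalizes readily to the larger queueing-network models the course studies.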

**MIE 707 - Viscous Fluids**

Exact solutions to Navier-Stokes flow and laminar boundary layer flow. Introduction to transition and turbulent boundary layers, and turbulence modeling. Boundary layer stability analysis using perturbation methods.

### Civil and environmental engineering

**CEE 511 - Traffic Engineering**

Fundamental principles of traffic flow and intersection traffic operations including traffic data collection methods, traffic control devices, traffic signal design, and analysis techniques. Emphasizes quantitative and computerized techniques for designing and optimizing intersection signalization. Several traffic engineering software packages used.

**CEE 548 - Finite Element Method**

Application of numerical methods to solution of problems of structural mechanics. Finite difference techniques and other methods for solution of problems in the vibration, stability, and equilibrium of structural elements.

**CEE 605 - Finite Element Analysis**

Introduction to finite element method in engineering science. Derivation of element equations by physical, variational, and residual methods. Associated computer coding techniques and numerical methods. Applications.

### Management

**Sch-Mgmt 640 - Financial Analysis and Decisions**

Basic concepts, principles, and practices involved in financing businesses and in maintaining efficient operation of the firm. Framework for analyzing savings-investment and other financial decisions. Both theory and techniques applicable to financial problem solving.

**Sch-Mgmt 641 - Financial Management**

Internal financial problems of firms: capital budgeting, cost of capital, dividend policy, rate of return, and financial aspects of growth. Readings and case-studies.

**Sch-Mgmt 745 - Financial Models**

Analytical approach to financial management. Emphasis on theoretical topics of financial decision making. Through use of mathematical, statistical, and computer simulation methods, various financial decision-making models are developed.

**Sch-Mgmt 747 - Theory of Financial Markets**

In-depth study of portfolio analysis and stochastic processes in security markets. Emphasis on quantitative solution techniques and testing procedures.

**Sch-Mgmt 871 - Micro Theory Of Finance**

Optimum financial policies and decisions of nonfinancial firms. Theory of competition and optimum asset management of financial firms.