
Mathematics and Computer Science
COLLOQUIUM

SUMMER 2009

 
Tuesday
May 5, 11:00-11:50, room D634
Jonathan Backer
Postdoctoral Fellow in Computer Science at the University of Saskatchewan.
PhD, UBC, Vancouver, 2009.
Research interests: computational geometry, bounded curvature motion planning, problems of a combinatorial nature.
Title: Variants of the maximum empty rectangle problem.
The maximum empty rectangle (MER) problem is as follows: given a set of points X in the plane and an axis-aligned rectangle E, find the largest-area axis-aligned rectangle R such that R is contained in E and R contains no point of X. In this talk, I will discuss new unpublished results on variants of this problem. In particular, I will talk about higher dimensions, the restriction to cubes, and bichromatic objectives (instead of maximising area, try to contain as many blue points as possible while containing no red points). These new results are nice because they use standard geometric techniques (sweep-planes) and rely on (relatively) easy-to-visualise geometric observations.
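For readers new to the problem, here is a minimal brute-force sketch of the basic (planar, area-maximising) MER problem; the point set, the function name, and the O(n^3) enumeration strategy are illustrative assumptions, not the sweep-plane techniques discussed in the talk.

```python
from itertools import combinations

# Minimal brute-force MER sketch (illustrative only).
# E is the bounding rectangle (x_min, y_min, x_max, y_max); points lie inside E.
# Every maximal empty rectangle has each side supported by a point or by the
# boundary of E, so it suffices to try all candidate left/right x-coordinates
# and, within that vertical strip, take the largest vertical gap between points.
# Points on the rectangle's boundary are treated as non-blocking (open rectangle).
def max_empty_rectangle(points, E):
    x_min, y_min, x_max, y_max = E
    xs = sorted({x_min, x_max} | {x for x, _ in points})
    best = 0.0
    for x1, x2 in combinations(xs, 2):
        # Points strictly inside the open strip x1 < x < x2 block the rectangle.
        strip_ys = sorted(y for x, y in points if x1 < x < x2)
        ys = [y_min] + strip_ys + [y_max]
        tallest_gap = max(b - a for a, b in zip(ys, ys[1:]))
        best = max(best, (x2 - x1) * tallest_gap)
    return best

points = [(1.0, 1.0), (2.0, 3.0), (3.5, 2.0)]
E = (0.0, 0.0, 5.0, 4.0)
print(max_empty_rectangle(points, E))  # 8.0: the rectangle [1,5] x [0,2]
```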
Friday
June 5, 11:00-11:50, room D630
Micah B. Milinovich
Assistant Professor, University of Mississippi
Ph.D., University of Rochester, 2008
Research interests: Analytic Number Theory, especially the theory of the Riemann zeta-function, L-functions, and multiplicative number theory.
Title: Mean-value estimates for the Riemann zeta-function.
The distribution of the prime numbers within the integers is intimately connected to the distribution of the zeros of the Riemann zeta-function. For this reason, questions concerning the zeros of the Riemann zeta-function are considered fundamental in analytic number theory. A natural way to study the zeros of an analytic function is through the use of mean-value theorems, that is, by studying averages of the function along a line, around a disc, or over a discrete set of points. In this talk we give an overview of these ideas and conclude by discussing some results and conjectures concerning mean-value estimates for the Riemann zeta-function.
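For context, two classical mean-value estimates of the kind referred to in the abstract (standard results, not the speaker's new material) are the second and fourth moments of the zeta-function on the critical line:

```latex
\int_0^T \left|\zeta\!\left(\tfrac{1}{2}+it\right)\right|^2 dt \;\sim\; T\log T
\qquad \text{(Hardy--Littlewood)},
\qquad
\int_0^T \left|\zeta\!\left(\tfrac{1}{2}+it\right)\right|^4 dt \;\sim\; \frac{1}{2\pi^2}\,T(\log T)^4
\qquad \text{(Ingham)}.
```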

SPRING 2009

 
Wednesday
Feb. 11, 3:00-3:50pm, room D511
Jonathan Seldin
Associate Professor of mathematics at the University of Lethbridge.

Research interests:
Combinatory logic, Lambda-Calculus, proof theory, history and philosophy of mathematics.
Natural Deduction and Sequent Calculi
Natural deduction systems of formal logic are systems with two rules for each connective or quantifier: one to introduce it and another to eliminate it. (There are minor exceptions for rules involving negation.) Sequent calculi (called consecution calculi by some logicians) are systems in which the statements to be proved are of the form Γ |- A, where Γ is a sequence of formulas (premises) and A is a formula, and the statement is interpreted as saying that the formula A can be deduced from the premises Γ. These systems also have rules in pairs, one for introducing each connective or quantifier on the left of the symbol "|-" and the other for introducing it on the right. The transitivity of "|-" is proved by a result called the cut elimination theorem, which is equivalent to the normalization theorem for natural deduction systems. In this talk, I will discuss the relationship between the two kinds of systems and between the cut elimination theorem and the normalization theorem.
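As a standard illustration of the two formats (not material specific to the talk), here are the natural-deduction rules for conjunction, the corresponding sequent-calculus rules, and the cut rule:

```latex
% Natural deduction: introduction and elimination rules for conjunction
\frac{A \quad B}{A \wedge B}\ (\wedge I)
\qquad
\frac{A \wedge B}{A}\ (\wedge E_1)
\qquad
\frac{A \wedge B}{B}\ (\wedge E_2)

% Sequent calculus: right and left rules for conjunction, and the cut rule
\frac{\Gamma \vdash A \quad \Gamma \vdash B}{\Gamma \vdash A \wedge B}\ (\vdash\!\wedge)
\qquad
\frac{\Gamma, A \vdash C}{\Gamma, A \wedge B \vdash C}\ (\wedge\!\vdash)
\qquad
\frac{\Gamma \vdash A \quad \Delta, A \vdash C}{\Gamma, \Delta \vdash C}\ (\mathrm{cut})
```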
Thursday
Feb. 26, 12:15-13:05, room TH201
Katalin Bimbó
Dr. Katalin Bimbó is an Assistant Professor in the Department of Philosophy at the University of Alberta.

Research interests:
nonclassical logics, combinatory logics, foundations of mathematics, computability, theoretical and philosophical problems of computer science, questions of information processing and computer security.
The disjunction property in B+T
The "minimal" relevance logic B has been known for some time. In the logic, B the premise combining connective is not stipulated to have any extra properties (such as commutativity). The conjunction-implication fragment (with T) is the logic behind the intersection type discipline. This talk is about the positive fragment of B with the constant T added (i.e., B+T), as well as about a property that B+T has --- the disjunction property. A consecution calculus for B+ was introduced in the book on relational semantics by Bimbo and Dunn 2008. We announced there (without including a detailed proof) both the admissibility of the single cut rule and the disjunction property for B+. I will show that the addition of T is not difficult. However, the complications, that are caused by the lack of permutation and associativity for fusion, persist in B+T --- both in the formulation of the sequent calculus and in the proof of the cut theorem. I will explain why the usual category theoretic view (that is related to the multiplicative fragment of linear logic) is not applicable directly. Time permitting I will sketch another proof of the disjunction property based on a relational semantics for B+T.
Monday
March 09, 12:00-12:50, room D634
John Sheriff
John Sheriff is an Assistant Professor of statistics at the University of Lethbridge.
Techniques for Calibrating Derivative Security Pricing Models
Model calibration is the problem of identifying the stochastic process that describes the behaviour of some underlying asset, based upon the prices of related derivative securities. This is a challenging and often ill-posed problem, and the challenge is exacerbated as one employs more complex models of the stochastic process describing market prices in an attempt to capture observed financial market behaviour more accurately. Examples of such models include jump-diffusion processes and processes incorporating deterministic or stochastic volatility. The talk will discuss the problem of calibrating jump-diffusion and deterministic-volatility models of stock prices, using the information contained in available option prices. Regularization and the use of evolutionary algorithms are introduced as two techniques for meeting the calibration challenge.
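As a minimal, hypothetical illustration of the regularization idea (not the speaker's models or data), the sketch below fits a two-parameter volatility curve to synthetic Black-Scholes option prices with a Tikhonov penalty; the parameterization, prior, and penalty weight are all assumptions.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.stats import norm

# Black-Scholes call price for spot S, strike K, rate r, maturity T, volatility sigma.
def bs_call(S, K, r, T, sigma):
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

# Hypothetical deterministic-volatility parameterization: a quadratic smile in strike.
def smile_vol(params, K, S0):
    a, b = params
    return a + b * (K / S0 - 1.0) ** 2

S0, r, T = 100.0, 0.02, 0.5
strikes = np.array([80.0, 90.0, 100.0, 110.0, 120.0])

# Synthetic "market" prices generated from a hidden smile (stands in for observed data).
true_params = np.array([0.20, 0.40])
market = bs_call(S0, strikes, r, T, smile_vol(true_params, strikes, S0))

prior = np.array([0.25, 0.0])   # prior guess about the smile parameters
lam = 1e-3                      # Tikhonov regularization weight

# Residuals: pricing errors plus a penalty pulling parameters toward the prior,
# which stabilizes the otherwise ill-posed inverse problem.
def residuals(params):
    model = bs_call(S0, strikes, r, T, smile_vol(params, strikes, S0))
    return np.concatenate([model - market, np.sqrt(lam) * (params - prior)])

fit = least_squares(residuals, x0=prior)
print("calibrated smile parameters:", fit.x)
```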
Friday
March 27, 12:00-12:50, room D634
Saieed Akbari
Professor of mathematics at the Sharif University of Technology
Senior Associate Researcher at the School of Mathematics, Institute for Research in Fundamental Sciences (IPM) in Teheran, Iran.
On Zero-Sum Flows in Graphs and Designs
For an undirected graph G, a "zero-sum flow" is an assignment of non-zero real numbers to the edges such that the sum of the values of all edges incident with each vertex is zero. It has been conjectured that if a graph G has a zero-sum flow, then it has a zero-sum 6-flow. Among other results, it is shown that if G is an r-regular graph (r >= 3), then G has a zero-sum 7-flow. Furthermore, if r is divisible by 3, then G has a zero-sum 5-flow. We generalize the concept of zero-sum flows to 2-designs. More precisely, by a "zero-sum flow" for a 2-design, we mean a nowhere-zero vector in the null space of its incidence matrix. We show that every Steiner triple system admits a zero-sum flow.
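As a tiny worked example (not taken from the talk), the complete graph K4 admits a zero-sum flow; the snippet below verifies one such assignment.

```python
from collections import defaultdict

# Edge values for a zero-sum flow on K4 (vertices 1..4): every edge value is
# non-zero, and the values on the edges incident with each vertex sum to zero.
flow = {(1, 2): 1, (3, 4): 1, (1, 3): 1, (2, 4): 1, (1, 4): -2, (2, 3): -2}

vertex_sum = defaultdict(int)
for (u, v), value in flow.items():
    vertex_sum[u] += value
    vertex_sum[v] += value

print(all(value != 0 for value in flow.values()))      # True: nowhere-zero
print(all(vertex_sum[v] == 0 for v in (1, 2, 3, 4)))   # True: zero sum at each vertex
```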
Monday
March 30, 12:00-12:50, room D634
Julita Vassileva

Dr. Vassileva received her Ph.D. in Mathematics and Computer Science from the University of Sofia, Bulgaria.
She moved to Canada in 1997 and is currently a Professor of Computer Science at the University of Saskatchewan.
Towards Social Learning Environments
We are teaching a new generation of students who have been cradled in technology, communication, and an abundance of information. As a result, the design of learning technologies needs to focus on supporting social learning in context. Instead of designing technologies that "teach" the learner, the new social learning technologies will perform three main roles:
1) support the learner in finding the right content (right for the context, for the particular learner, for the specific purpose of the learner, right pedagogically);
2) support learners to connect with the right people (right for the context, learner, purpose, educational goal etc.), and
3) motivate / incentivize people to learn. In the pursuit of such environments, new areas of science become relevant as sources of methods and techniques: social psychology, economic and game theory, and multi-agent systems.
This talk gives an overview of recent research carried out at the MADMUC lab at the University of Saskatchewan, which illustrates how social learning technologies can be designed using some existing and emerging technologies: ontologies vs. social tagging, exploratory search, collaborative vs. self-managed social recommendations, trust and reputation mechanisms, mechanism design and social visualization.
Tuesday
March 31, 12:15-13:05, room TH201
Julita Vassileva

Since September 2005, Dr. Vassileva has been the NSERC/Cameco Chair for Women in Science and Engineering for the Prairies.
Her goal as a Chair is to increase the participation of women in science and engineering and to provide role models for women active in and considering careers in these fields.

She also currently serves as the ACM-W Ambassador for Canada.
Girls and Science: in the World and on the Prairies
This talk starts with an overview of the results of a project which investigated the attitudes towards science and technology of girls and boys from countries around the world. The findings are related to the differing participation of women in science and engineering across countries. Differences in horizontal, vertical, and geographical representation will be discussed.
The second part of the talk will focus on the representation of women in science, engineering and technology on the Prairies. The specifics of the region, including the demographics of our young generation, will be outlined. Several science outreach programs initiated by the NSERC/Cameco Chair for Women in Science and Engineering will be presented: Saskatchewan Science Network, The Science Ambassadors, myWISEmentor, and ourWISEtales.
Tuesday
April 7, 10:00-10:50, room D633
Osmar Zaiane
McCalla Professor in Computing Science at the University of Alberta, Edmonton.
Scientific Director of the Alberta Ingenuity Centre for Machine Learning.
On Associative Classifiers: Achievements and Potential
There are countless paradigms and strategies for devising a classifier. Associative classifiers use association rules as a model and are a relatively new approach to rule-based classification. While they are still not as accurate as other approaches, they have many advantages and certainly have potential in many applications.
We will briefly introduce associative classifiers, discuss their three main phases: rule generation, rule pruning, and rule selection; and highlight the differences between the suggested strategies.
We will also present our current research work targeting these three individual phases to improve the effectiveness of such classifiers.
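As a toy sketch of the three phases on made-up data (the dataset, thresholds, and tie-breaking order are arbitrary assumptions, not the strategies compared in the talk):

```python
from itertools import combinations
from collections import defaultdict

# Toy training data: each example is (set of attribute=value items, class label).
train = [
    ({"outlook=sunny", "windy=no"}, "play"),
    ({"outlook=sunny", "windy=yes"}, "stay"),
    ({"outlook=rain", "windy=yes"}, "stay"),
    ({"outlook=overcast", "windy=no"}, "play"),
    ({"outlook=rain", "windy=no"}, "play"),
]

MIN_SUPPORT = 2       # minimum number of examples covering an itemset
MIN_CONFIDENCE = 0.6  # minimum confidence for a kept rule

# Phase 1: rule generation -- count itemset/class co-occurrences by brute force.
rule_counts = defaultdict(lambda: defaultdict(int))  # itemset -> class -> count
itemset_counts = defaultdict(int)
for items, label in train:
    for r in range(1, len(items) + 1):
        for subset in combinations(sorted(items), r):
            itemset_counts[subset] += 1
            rule_counts[subset][label] += 1

# Phase 2: rule pruning -- keep only sufficiently supported, confident rules.
rules = []  # (antecedent itemset, class, confidence, rule support)
for subset, sup in itemset_counts.items():
    if sup < MIN_SUPPORT:
        continue
    for label, cnt in rule_counts[subset].items():
        conf = cnt / sup
        if conf >= MIN_CONFIDENCE:
            rules.append((set(subset), label, conf, cnt))

# Prefer more confident, then more specific, then better supported rules.
rules.sort(key=lambda rule: (-rule[2], -len(rule[0]), -rule[3]))

# Phase 3: rule selection -- classify with the first (best) matching rule.
def classify(example_items, default="play"):
    for antecedent, label, conf, sup in rules:
        if antecedent <= example_items:
            return label
    return default

print(classify({"outlook=sunny", "windy=no"}))  # "play"
```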
Wednesday
April 15, 12:00-12:50, room D634
Behruz Tayfeh Rezaie
Associate Professor at the School of Mathematics of the Institute for Research in Fundamental Sciences (IPM) in Teheran, Iran.
Spectral characterization of graphs.
Two nonisomorphic graphs with the same adjacency spectrum (the multiset of eigenvalues) are called cospectral. A graph is said to be determined by its spectrum (DS for short) if it is not cospectral with any other graph. It is conjectured that almost all graphs are DS. The motivation for studying DS graphs comes from the graph isomorphism problem, which is one of the most important open problems in graph theory and computer science. In this talk, I will give a survey of known families of DS graphs. A number of methods for constructing cospectral graphs will also be discussed.
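As a concrete, standard example (not from the talk), the star K_{1,4} and the disjoint union C_4 + K_1 form the smallest pair of cospectral, nonisomorphic graphs; the snippet below checks this numerically with numpy.

```python
import numpy as np

# Adjacency matrix of the star K_{1,4}: vertex 0 joined to vertices 1..4.
star = np.zeros((5, 5))
star[0, 1:] = 1
star[1:, 0] = 1

# Adjacency matrix of C_4 (vertices 0..3 in a cycle) plus an isolated vertex 4.
c4_plus_k1 = np.zeros((5, 5))
for u, v in [(0, 1), (1, 2), (2, 3), (3, 0)]:
    c4_plus_k1[u, v] = c4_plus_k1[v, u] = 1

spec1 = np.sort(np.linalg.eigvalsh(star))
spec2 = np.sort(np.linalg.eigvalsh(c4_plus_k1))
print(np.round(spec1, 6), np.round(spec2, 6))  # both are [-2, 0, 0, 0, 2]
print(np.allclose(spec1, spec2))               # True: cospectral but nonisomorphic
```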


 
Saturday, March 28: SIXTH COMBINATORICS DAY



FALL 2008

Mondays, 12:00 to 12:50 - Room C674 (University Hall)
 
Dec. 01: Cheryl Praeger,
Professor, University of Western Australia.
Member of the Order of Australia, Fellow of the Australian Academy of Science, former president of the Australian Mathematical Society
(for more, see her biography from Women Mathematicians)

Research interests: Prof. Praeger heads the research group in Group Theory and Combinatorics. Her research interests are Groups (especially permutation groups and group algorithms), Graphs, Finite Geometries, and Combinatorial Designs.
The random revolution: how statistics and complexity theory have partnered more traditional mathematics to achieve smarter computation.
Advances in modern computational power have led to dramatic changes in mathematics, including solutions to problems in large complex systems that were previously considered infeasible. Much of this is due to the use of randomised algorithms, and this is no less true in algebra. The talk describes how powerful Monte Carlo methods developed in various areas of science, especially statistics, have transformed computational algebra, underpinned by the power of the finite simple group classification. The speaker is an algebraist, and several examples from her area will be given as illustrations.
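As a toy illustration of the Monte Carlo flavour in a group-theoretic setting (not one of the speaker's algorithms), the sketch below samples uniformly random permutations of S_n and estimates the proportion that are single n-cycles, whose exact value is 1/n.

```python
import random

def is_n_cycle(perm):
    # perm[i] is the image of i. Follow the cycle through position 0;
    # the permutation is an n-cycle exactly when that cycle has length n.
    n, i, length = len(perm), perm[0], 1
    while i != 0:
        i = perm[i]
        length += 1
    return length == n

n, trials = 20, 100_000
hits = 0
for _ in range(trials):
    perm = list(range(n))
    random.shuffle(perm)  # uniform random element of S_n
    hits += is_n_cycle(perm)

print(hits / trials, "vs exact value", 1 / n)  # Monte Carlo estimate vs 1/n
```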

Everybody is welcome!


Previous years:
2007-2008 - 2005 - 2004 - 2003 - 2002 - 2001

Contact: Habiba Kadiri (kadiri@cs.uleth.ca)