CGL Meeting Agenda

Wednesday, April 5, 1995


Location:
DC 1304
Time:
1:30 PM
Chair:
Haroon Sheikh

1. Adoption of the agenda - additions/deletions thereto.

2. Coffee Hour and Next Meeting:

Coffee hour this week:
???
Coffee hour next week:
???

Next Meeting
Date:
April 12, 1995
Location:
DC 1304
Chair:
Jay Steele
Tech Presentation:
Stephen Mereu
Forthcoming: (list next 4 and trades)
Chairs:
Riston Tapp
Greg Veres
Bill Wallace
Julie Waterhouse
Tech presentations:
Andrew Park
Glenn Paulley
Randall Reid
Haroon Sheikh

3. Technical Presentation:

Presenter:
Mike McCool
Title:
Abstract:

4. General Discussion Items:

5. Action List (remember to update AL_active)

6. Directors' Meeting:

7. Seminar(s):


DEPARTMENT OF COMPUTER SCIENCE
UNIVERSITY OF WATERLOO
SEMINAR ACTIVITIES

MASTER'S THESIS PRESENTATION

                     Friday, April 7, 1995

Alain Gaudrault, graduate student, Dept. Comp. Sci., Univ.
Waterloo, will speak on "Load Balancing in a Distributed Virtual
Reality Environment."

TIME:                10:30 a.m. - 12:30 p.m.

ROOM:                DC 1302

ABSTRACT

The WAterloo Virtual Environment System (WAVES) has been designed
to provide virtual reality application developers with the tools
to create distributed virtual worlds without concern for the
underlying implementation.  Distributing the tasks generated by
virtual reality software becomes a necessity if virtual worlds
are to scale up gracefully.  By incorporating load balancing
techniques within WAVES, virtual worlds of great size and
fidelity may be possible.  In this thesis, load balancing
approaches are reviewed, and an algorithm is developed and
implemented.  Simulation software to test the efficiency of the
load balancing algorithm is described, and results of the testing
phase are included.  Future development of both WAVES and the
load balancing scheme is discussed.

April 10.  Davis Centre 1331.  Noon.
Graeme Hirst, Computer Science Department, University of Toronto.
Existence assumptions in knowledge representation

Abstract.  If knowledge representation formalisms are to be suitable
for semantic interpretation of natural language, they must be more
adept with representations of existence and non-existence than they
presently are.  Quantifiers must sometimes scope over non-existent
entities.  I review the philosophical background, including Anselm
and Kant, and exhibit some ontological problems that natural language
sentences pose for knowledge representation.  The paraphrase methods
of Russell and Quine are unable to deal with many of the problems.
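
As a quick reminder of the paraphrase approach (standard background,
not part of the abstract): Russell's theory of descriptions renders
"The present king of France is bald" as

    \exists x\, [ \mathit{King}(x) \land
                  \forall y\, (\mathit{King}(y) \rightarrow y = x) \land
                  \mathit{Bald}(x) ]

that is, there is exactly one king of France, and he is bald.  When
no king exists the whole formula is simply false; non-existence can
then only be expressed by negating the existential, not by
predicating anything of a non-existent entity.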

Unfortunately, the shortcomings of the Russell-Quine ontology are
reflected in most current knowledge representation formalisms in AI.
Several alternatives are considered, including some intensional
formalisms and the work of Hobbs, but all have problems.  Free logics
and possible worlds don't help either.  But useful insights are found
in the Meinongian theory of Parsons, in which a distinction between
nuclear and extranuclear kinds of predicates is made and used to define
a universe over which quantification scopes.  If this is combined with
a naive ontology, with about eight distinct kinds of existence, a
better approach to the representation of non-existence can be developed
within Hobbs's basic formalism.

Biography.  Graeme Hirst is the author of /Semantic interpretation and the
resolution of ambiguity/ (Cambridge University Press, 1987).
He serves as book review editor of /Computational Linguistics/.

                   The University of Waterloo
                      200 University Avenue
                        Waterloo, Ontario


           The Institute for Computer Research (ICR)

                    Presents a Colloquium on

                     Uncheatable Benchmarks


by:     Dr. Jin-yi Cai

of:     Department of Computer Science
        SUNY Buffalo
        Buffalo, New York


Date:   Wednesday, April 12, 1995
Time:   3:30 p.m.
Place:  William G. Davis Computer Research Centre, Room 1302


Abstract:

Benchmarks have been used to test everything from the speed of a
processor, to the accuracy of computations, to the capacity of a
memory system.  The computing community relies heavily on the use
of benchmarks to assess how well a given hardware or software
system operates.  The recent fiasco involving the Pentium chips
is an excellent reminder that while certain computations may seem
elementary, adequate testing using good benchmarks is
indispensable.

Up until now, the study of the art of designing a good benchmark
has focused on making the benchmark "realistic", i.e., predictive
of how well the system will perform on its intended applications.
The issue of making benchmark results trustworthy has been
relegated to "trusted" or third-party agents, and little
attention has been paid to the question of making benchmarks
themselves "uncheatable".

We wish to study the problem of how one can make benchmarks
resistant to tampering and hence more trustworthy.  We propose a
framework based on modern cryptography and complexity theory.
Concrete schemes are proposed for different benchmarks: speed of
the processor, memory capacity, sorting, etc.  These schemes are
"uncheatable" if certain complexity-theoretic assumptions hold,
namely the hardness of factoring and of the discrete logarithm.
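
To make the flavour of such a scheme concrete, here is a minimal
illustrative sketch in Python (all parameters hypothetical, and not
one of the constructions from the talk): the tester publishes a
modulus N whose factorization it keeps secret, and asks the machine
under test to perform t sequential modular squarings.  No shortcut
for the iterated squaring is known without the factorization of N,
yet the tester, who knows phi(N), can verify the claimed result
with a single fast exponentiation.

    # Illustrative sketch only; parameters are toy-sized.
    import time

    p, q = 999983, 1000003    # secret primes held by the tester
    N = p * q                 # public modulus given to the tested machine
    phi = (p - 1) * (q - 1)   # trapdoor used only for verification
    x, t = 5, 200000          # base and number of sequential squarings

    # Machine under test: t sequential squarings, the timed workload.
    start = time.perf_counter()
    y = x % N
    for _ in range(t):
        y = y * y % N
    elapsed = time.perf_counter() - start

    # Tester: Euler's theorem (gcd(x, N) = 1) lets the exponent 2^t
    # be reduced modulo phi(N), so verification costs one pow() call.
    ok = (y == pow(x, pow(2, t, phi), N))
    print("%.3fs, result %s" % (elapsed, "verified" if ok else "REJECTED"))

A machine that reports a faster time than it actually achieved must
produce the correct y without doing the squarings, which, as far as
is known, requires factoring N.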

We also present a scheme for matrix multiplication which depends
on the numerical instability of certain floating point
arithmetic.  If time permits we will present a new scheme which
could have captured the errors in the Pentium chip.

Part of this talk will be based on joint work with R. Lipton,
R. Sedgewick and A. Yao, and also on joint work with my students
S. Ar and A. Nerurkar.  Supported by NSF grants CCR9319393 and
CCR9057486, and a Sloan fellowship.


Everyone is welcome.  Refreshments served.

8. Lab Cleanup (until 2:30 p.m. or 5 minutes, whichever is longer)