Colloquia and Seminars

To join the email distribution list of the CS colloquia, please visit the list subscription page.


Computer Science events calendar, in HTTP and ICS formats, for Google Calendar and for Outlook.

Academic Calendar at the Technion site.

Upcoming Colloquia & Seminars

  • Search for Smart Evaders with UAV Swarms

    Speaker:
    Roee Francos, M.Sc. Thesis Seminar
    Date:
    Sunday, 26.1.2020, 11:00
    Place:
    Taub 401 Taub Bld.
    Advisor:
    Prof. A. Bruckstein

    Suppose that in a given planar circular region there are some smart mobile evaders, and we would like to find them using swarms of sweeping agents. A smart evader is a target that detects and responds to the motions of searchers by performing evasive maneuvers to avoid interception. We consider various search configurations for the sweeping swarm of agents and present guaranteed search techniques for single-agent and multi-agent swarms. These search procedures enable both confinement of smart evaders to their original domain and complete detection of all evaders by searching the entire expanding domain; a toy confinement check is sketched below.
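    To make the confinement idea concrete, here is a deliberately crude toy model (our illustration, not the speaker's algorithm): a single sweeper with sensor radius r circles a disk of radius R0 at speed v, while undetected evaders with maximal speed u enlarge the occupied region. The disk-shaped-region assumption and all quantities are ours.

    ```python
    import math

    def confines(R0, r, v, u):
        """Crude check, assuming the evader region stays a disk:
        one loop around a disk of radius R0 takes t = 2*pi*R0 / v,
        during which evaders expand the region radially by u*t,
        while the sweep clears a ring of width at most r."""
        t_loop = 2 * math.pi * R0 / v   # time for one full sweep
        growth = u * t_loop             # radial expansion of evaders
        return growth < r               # clearing outpaces expansion

    # Region of radius 10, sensor radius 1, sweeper speed 30:
    print(confines(R0=10.0, r=1.0, v=30.0, u=0.4))  # True: confined
    print(confines(R0=10.0, r=1.0, v=30.0, u=0.6))  # False: may escape
    ```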

  • Theory Seminar: Strong Average-Case Circuit Lower Bounds from Non-trivial Derandomization

    Speaker:
    Lijie Chen (MIT)
    Date:
    Wednesday, 29.1.2020, 12:30
    Place:
    Taub 201 Taub Bld.

    We prove that for all constants a, NQP = NTIME[n^polylog(n)] cannot be (1/2 + 2^(-log^a n) )-approximated by 2^(log^a n)-size ACC^0 circuits. Previously, it was even open whether E^NP can be (1/2+1/sqrt(n))-approximated by AC^0[2] circuits. As a straightforward application, we obtain an infinitely often non-deterministic pseudorandom generator for poly-size ACC^0 circuits with seed length 2^{log^eps n}, for all eps > 0.
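    To make the approximation notion explicit, the headline claim can be restated in display form (a restatement of the abstract in its own notation; the quantifier over input lengths n is as in the paper):

    ```latex
    % For every constant a > 0 there is a language L in NQP that no
    % ACC^0 circuit of size 2^{log^a n} computes noticeably better
    % than random guessing:
    \[
      \forall a>0\ \ \exists L\in\mathrm{NQP}:\qquad
      \Pr_{x\in\{0,1\}^n}\!\big[C(x)=L(x)\big]\ \le\ \tfrac{1}{2}+2^{-\log^a n}
      \quad\text{for every } \mathrm{ACC}^0 \text{ circuit } C
      \text{ of size } 2^{\log^a n}.
    \]
    ```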

    More generally, we establish a connection showing that, for a typical circuit class C, non-trivial nondeterministic CAPP algorithms imply strong (1/2 + 1/n^{omega(1)}) average-case lower bounds for nondeterministic time classes against C circuits. The existence of such (deterministic) algorithms is much weaker than the widely believed conjecture PromiseBPP = PromiseP.

    Our new results build on a line of recent works, including [Murray and Williams, STOC 2018], [Chen and Williams, CCC 2019], and [Chen, FOCS 2019]. In particular, they strengthen the corresponding (1/2 + 1/polylog(n))-inapproximability average-case lower bounds of [Chen, FOCS 2019]. The two important technical ingredients are techniques from Cryptography in NC^0 [Applebaum et al., SICOMP 2006], and Probabilistically Checkable Proofs of Proximity with NC^1-computable proofs.

    This is joint work with Hanlin Ren from Tsinghua University.

  • Pixel Club: Learning-Based Strong Solutions to Forward and Inverse Problems in Partial Differential Equations

    Speaker:
    Leah Bar (Tel Aviv University)
    Date:
    Tuesday, 18.2.2020, 11:30
    Place:
    Electrical Eng. Building 1061

    We introduce a novel neural-network-based partial differential equation solver for forward and inverse problems. The solver is grid-free, mesh-free, and shape-free, and the solution is approximated by a neural network.

    We employ an unsupervised approach in which the input to the network is a set of points in an arbitrary domain and the output is the set of corresponding function values. The network is trained to minimize the deviation of the learned function from the PDE solution and to satisfy the boundary conditions.

    The resulting solution is, in turn, an explicit, smooth, differentiable function with a known analytical form.

    Unlike in other numerical methods such as finite differences and finite elements, the derivatives of the desired function can be calculated analytically to any order. This framework therefore enables the solution of high-order non-linear PDEs. The proposed algorithm is a unified formulation of both forward and inverse problems, where the optimized loss function consists of a few elements: fidelity terms in the L2 and L-infinity norms, boundary- and initial-condition constraints, and additional regularizers. This setting is flexible in the sense that the regularizers can be tailored to specific problems. We demonstrate our method on several free-shape 2D second-order systems, with applications to Electrical Impedance Tomography (EIT), diffusion, and wave equations. A minimal sketch of such a residual-based training loss follows below.
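    The following is a minimal sketch of the kind of unsupervised, mesh-free training loop described above (our illustration, not the speaker's code), for the 2D Poisson equation -Δu = f on the unit square with zero boundary conditions. The network architecture, source term, point counts, and boundary weight are all our assumptions, and only the L2 fidelity term of the loss is shown.

    ```python
    import math
    import torch

    # The solution u(x, y) is parameterized by a small fully connected
    # network; derivatives are obtained with autograd, so the PDE
    # residual is evaluated on random points rather than on a grid.
    net = torch.nn.Sequential(
        torch.nn.Linear(2, 64), torch.nn.Tanh(),
        torch.nn.Linear(64, 64), torch.nn.Tanh(),
        torch.nn.Linear(64, 1),
    )

    def f(xy):  # hypothetical source term chosen for this demo
        x, y = xy[:, :1], xy[:, 1:]
        return 2 * math.pi**2 * torch.sin(math.pi * x) * torch.sin(math.pi * y)

    def laplacian(u, xy):
        (grad,) = torch.autograd.grad(u.sum(), xy, create_graph=True)
        lap = 0.0
        for i in range(2):  # d2u/dx2 + d2u/dy2
            (g2,) = torch.autograd.grad(grad[:, i].sum(), xy, create_graph=True)
            lap = lap + g2[:, i:i + 1]
        return lap

    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    for step in range(2000):
        xy = torch.rand(256, 2, requires_grad=True)  # interior points
        residual = -laplacian(net(xy), xy) - f(xy)   # PDE residual
        t = torch.rand(128, 1)
        zeros, ones = torch.zeros_like(t), torch.ones_like(t)
        boundary = torch.cat([                       # points on the 4 edges
            torch.cat([t, zeros], 1), torch.cat([t, ones], 1),
            torch.cat([zeros, t], 1), torch.cat([ones, t], 1)])
        loss = (residual**2).mean() + 10.0 * (net(boundary)**2).mean()
        opt.zero_grad(); loss.backward(); opt.step()
    ```

    Because net is an explicit smooth function, any higher-order derivative of the learned solution can likewise be taken with autograd, which is the property the abstract highlights.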

    Short bio:
    Leah Bar holds a B.Sc. in Physics, an M.Sc. in Biomedical Engineering, and a Ph.D. in Electrical Engineering from Tel Aviv University. She was a postdoctoral fellow in the Department of Electrical Engineering at the University of Minnesota. She is currently a senior researcher at MaxQ-AI, a medical AI start-up, and, in addition, a researcher in the Department of Mathematics at Tel Aviv University. Her research interests are machine learning, image processing, computer vision, and variational methods.

  • Hypernetworks and a New Feedback Model

    Speaker:
    Lior Wolf - COLLOQUIUM LECTURE
    Date:
    Tuesday, 31.3.2020, 14:30
    Place:
    Room 337 Taub Bld.
    Affiliation:
    School of Computer Science, Tel-Aviv University
    Host:
    Yuval Filmus

    Hypernetworks, also known as dynamic networks, are neural networks in which the weights of at least some of the layers vary dynamically based on the input. Such networks have composite architectures in which one network predicts the weights of another network (a toy sketch of this weight-prediction scheme appears at the end of this announcement). I will briefly describe the early days of dynamic layers and present recent results from diverse domains: 3D reconstruction from a single image, image retouching, electrical circuit design, decoding block codes, graph hypernetworks for bioinformatics, and action recognition in video. Finally, I will present a new hypernetwork-based model for the role of feedback in neural computations.

    Short bio:
    Lior Wolf is a faculty member at the School of Computer Science at Tel Aviv University and a research scientist at Facebook AI Research. Before that, he was a postdoc working with Prof. Poggio at CBCL, MIT. He received his PhD working with Prof. Shashua at the Hebrew University of Jerusalem.

    Refreshments will be served from 14:15; the lecture starts at 14:30.
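    As referenced above, here is a minimal, self-contained hypernetwork sketch (our illustration of the general idea, not any of the speaker's models): a small "hyper" network maps a conditioning input z to the weights and bias of a target linear layer, which is then applied to the data with those predicted parameters. All sizes and names are hypothetical.

    ```python
    import torch

    IN, OUT, Z = 8, 4, 3  # target layer input/output sizes, size of z

    # The hypernetwork predicts all OUT*IN weight entries plus OUT biases.
    hyper = torch.nn.Sequential(
        torch.nn.Linear(Z, 32), torch.nn.ReLU(),
        torch.nn.Linear(32, OUT * IN + OUT),
    )

    def dynamic_layer(x, z):
        """Apply a linear layer whose parameters depend on z."""
        params = hyper(z)                     # shape: (OUT*IN + OUT,)
        W = params[:OUT * IN].view(OUT, IN)   # input-dependent weights
        b = params[OUT * IN:]                 # input-dependent bias
        return torch.nn.functional.linear(x, W, b)

    x = torch.randn(5, IN)  # a batch of data
    z = torch.randn(Z)      # conditioning signal (could be x itself)
    print(dynamic_layer(x, z).shape)  # torch.Size([5, 4])
    ```

    Note that gradients flow into hyper's parameters, so it is the weight-predicting network, rather than the target layer, that gets trained.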

  • Second-order Optimization for Machine Learning, Made Practical

    Speaker:
    Tomer Koren - COLLOQUIUM LECTURE
    Date:
    Tuesday, 5.5.2020, 14:30
    Place:
    Room 337 Taub Bld.
    Affiliation:
    School of Computer Science at Tel Aviv University
    Host:
    Yuval Filmus

    Optimization in machine learning, both theoretical and applied, is presently dominated by first-order gradient methods such as stochastic gradient descent. Second-order optimization methods, which involve second-order derivatives and/or second-order statistics of the data, have become far less prevalent despite strong theoretical properties, due to their impractical computation, memory, and communication costs. I will present some recent theoretical, algorithmic, and infrastructural advances that allow these challenges to be overcome, making second-order methods practical and yielding significant performance gains at very large scale and on highly parallel computing architectures (a toy second-order update appears at the end of this announcement).

    Short bio:
    Tomer Koren has been an Assistant Professor in the School of Computer Science at Tel Aviv University since Fall 2019. Previously, he was a Senior Research Scientist at Google Brain, Mountain View. He received his PhD in December 2016 from the Technion - Israel Institute of Technology, where his advisor was Prof. Elad Hazan. His research interests are in machine learning and optimization.

    Refreshments will be served from 14:15; the lecture starts at 14:30.
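    As referenced above, the following toy example (ours, not the speaker's method) shows the appeal of second-order information: one Newton step, built from the gradient and the full Hessian, solves a quadratic problem exactly, whereas gradient descent would need many steps. The O(d^2) Hessian cost is precisely what makes naive second-order methods impractical at modern scales, motivating the approximations such talks address.

    ```python
    import torch

    torch.manual_seed(0)
    A = torch.randn(6, 6)
    H_true = A @ A.T + 6 * torch.eye(6)  # well-conditioned PSD matrix
    b = torch.randn(6)

    def loss(w):  # quadratic objective: 0.5 w^T H w - b^T w
        return 0.5 * w @ H_true @ w - b @ w

    w = torch.zeros(6, requires_grad=True)
    g = torch.autograd.grad(loss(w), w)[0]          # first-order info
    H = torch.autograd.functional.hessian(loss, w)  # second-order info
    with torch.no_grad():
        w_newton = w - torch.linalg.solve(H, g)     # one Newton step

    # The Newton iterate equals the exact minimizer H^{-1} b.
    print(torch.allclose(w_newton, torch.linalg.solve(H_true, b), atol=1e-4))
    ```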