Understanding Evolvability in Biological Systems

On the power and the limits of evolvability

An overview of the concept of evolvability, exploring its power and limits through models such as PAC learning and evolution algorithms. The presentation shows how evolution algorithms, selection fitness, and mutation algorithms contribute to the evolvability of function classes, and how feedback in this model is restricted to the values of a polynomial number of samples.

  • Evolvability
  • Biology
  • PAC Learner
  • Evolution Algorithm
  • Selection Fitness

Presentation Transcript


  1. On the power and the limits of evolvability
     Vitaly Feldman, Almaden Research Center

  2. Learning from examples vs. evolvability
     [Diagram: the functions learnable from examples strictly contain the evolvable functions; parity functions are learnable from examples but not evolvable]

  3. The core model: PAC [Valiant 84]
     Learner observes random examples (x, f(x)).
     Assumption: the unknown Boolean function f : X → {-1,1} labeling the examples comes from a known class of functions C.
     Distribution D over X (e.g. {0,1}^n or {-1,1}^n); the guarantee must hold for every distribution D.
     For every f ∈ C and ε > 0, w.h.p. output h : X → [-1,1] s.t. E_D[h(x)·f(x)] ≥ 1 - ε; for Boolean h this is Pr_D[h(x) ≠ f(x)] ≤ ε/2.
     Efficient: poly(n, 1/ε) time.
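To make the acceptance criterion concrete, here is a minimal sketch (Python; the function name and sampling interface are my own, not from the talk) of estimating the correlation-based performance used throughout, and its relation to classification error for Boolean hypotheses.

```python
import numpy as np

def empirical_perf(h, f, xs):
    """Estimate Perf_D(h, f) = E_D[h(x) * f(x)] from a sample xs drawn from D.

    For a {-1, +1}-valued h this equals 1 - 2 * Pr_D[h(x) != f(x)], so
    Perf >= 1 - eps is the same as error <= eps / 2 (the PAC criterion above).
    """
    hv = np.array([h(x) for x in xs], dtype=float)
    fv = np.array([f(x) for x in xs], dtype=float)
    return float(np.mean(hv * fv))
```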

  4. Classical example: halfspaces / linear threshold functions
     o sign(Σ_i w_i·x_i - θ) for w_1, w_2, ..., w_n, θ ∈ ℝ
     o equivalent to the examples being linearly separable
     Perceptron algorithm [Rosenblatt 57; Block 62; Novikoff 62]
     o Start with the LTF h_0 defined by w_0 = (0, 0, ..., 0), θ_0 = 0.
     o Get a random example (x, ℓ). If sign(w_t·x - θ_t) = ℓ, do nothing.
     o Else let h_{t+1} be the LTF defined by w_{t+1} = w_t + ℓ·x and θ_{t+1} = θ_t - ℓ.
     Gives PAC learning if f has a significant margin γ on the observed data points.
     [Figure: linearly separable positive (+) and negative (-) points]
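A minimal sketch of the Perceptron update described above (Python; the function name, the label convention y ∈ {-1, +1}, and the fixed number of passes are my own simplifications).

```python
import numpy as np

def perceptron(examples, passes=10):
    """Perceptron: start from the all-zero LTF and, on every misclassified
    example (x, y) with y in {-1, +1}, update w <- w + y*x and theta <- theta - y,
    so that sign(w.x - theta) moves toward the correct label."""
    n = len(examples[0][0])
    w, theta = np.zeros(n), 0.0
    for _ in range(passes):
        for x, y in examples:
            if np.sign(np.dot(w, x) - theta) != y:   # mistake: apply the update
                w += y * np.asarray(x, dtype=float)
                theta -= y
    return w, theta
```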

  5. Evolution algorithm
     R - representation class of functions over X
     o E.g. all linear thresholds over ℝ^n
     M - randomized mutation algorithm that, given r ∈ R, outputs (a random mutation) r' ∈ R
     o Efficient: poly in n, 1/ε
     o E.g. choose a random i and adjust w_i by 0, +1 or -1, chosen at random
     [Figure: a hypothesis r is fed to the mutation algorithm (R, M)]
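A sketch of the example mutation algorithm for linear thresholds mentioned on this slide (Python; the function name is mine).

```python
import random

def mutate_ltf(w):
    """One random mutation of an LTF weight vector: pick a random coordinate i
    and adjust w_i by -1, 0, or +1, each chosen with equal probability."""
    w = list(w)                   # copy, so the parent representation is unchanged
    i = random.randrange(len(w))
    w[i] += random.choice([-1, 0, 1])
    return w
```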

  6. Selection
     Fitness: Perf_D(r, f) ∈ [-1,1]
     o Correlation: E_D[r(x)·f(x)]
     For r ∈ R, sample M(r) p times: r_1, r_2, ..., r_p.
     Estimate the empirical performance of r and of each r_i using s samples: Perf_s(r_i, f).
     Natural selection: Select(f, ε, r); p and s are feasible (polynomial in n, 1/ε).
     [Figure: empirical performance scale from -1 to 1]
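A sketch of one round of selection under a beneficial/neutral rule (Python; the signature, the `tolerance` parameter, and the convention of returning None on extinction are my assumptions about details not spelled out on the slide).

```python
import random

def select(r, mutate, perf_estimate, pool_size, tolerance):
    """Draw pool_size candidate mutations of r, estimate the empirical performance
    of r and of each candidate, then return a random beneficial mutation (gain of
    at least `tolerance`); if there is none, return a random neutral one (within
    `tolerance` of r's performance); otherwise the lineage goes extinct (None)."""
    base = perf_estimate(r)
    candidates = [mutate(r) for _ in range(pool_size)]
    scored = [(c, perf_estimate(c)) for c in candidates]
    beneficial = [c for c, v in scored if v >= base + tolerance]
    neutral = [c for c, v in scored if v >= base - tolerance]
    if beneficial:
        return random.choice(beneficial)
    if neutral:
        return random.choice(neutral)
    return None
```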

  7. Evolvability
     A class of functions C is evolvable over D if there exist an evolution algorithm (R, M) and a polynomial g(·,·) s.t. for every f ∈ C, r_0 ∈ R, ε > 0 and every sequence r_0, r_1, r_2, ... with r_{i+1} = Select(f, ε, r_i), it holds that Perf_D(r_{g(n, 1/ε)}, f) ≥ 1 - ε w.h.p.
     Evolvable (distribution-independently): evolvable for all D by the same mutation algorithm (R, M).
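The definition describes a simple generation-by-generation loop; a minimal sketch (Python; `select_step` is a hypothetical wrapper around Select with f and ε fixed).

```python
def evolve(r0, select_step, generations):
    """Run the sequence r_0, r_1, r_2, ... from the definition of evolvability:
    r_{i+1} = Select(f, eps, r_i), for a polynomial number g(n, 1/eps) of
    generations, and return the final representation."""
    r = r0
    for _ in range(generations):
        nxt = select_step(r)
        if nxt is None:        # extinction: no beneficial or neutral mutation survived
            break
        r = nxt
    return r
```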

  8. Limits of evolvability
     Feedback is restricted to the values Perf_s(r_i, f) for some polynomial number of samples s.
     [Figure: a learning algorithm interacts with a Perf(s) oracle, submitting r_1, r_2, ..., r_q and receiving v_i = Perf_s(r_i, f) evaluated on s fresh examples]

  9. Evolvable ⊆ CSQ learnable ⊆ PAC learnable
     Learnable with the Perf_s oracle = CSQ learnable.
     Correlational Statistical Query: for a query φ : X → [-1,1], the CSQ oracle responds with any value v such that |v - E_D[φ(x)·f(x)]| ≤ τ, for tolerance τ ≥ 1/poly(n, 1/ε).
     Learning by Distances [Ben-David, Itai, Kushilevitz 90]; a restriction of the SQ model of Kearns [93].
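A simulated CSQ oracle for experiments (Python; illustrative only, the names are mine): any answer within tolerance τ of the true correlation is valid, so here the exact sample average is perturbed by up to τ.

```python
import numpy as np

def csq_oracle(phi, f, xs, tau, rng=np.random.default_rng(0)):
    """Answer a correlational statistical query phi: X -> [-1,1] with any value
    within tolerance tau of E_D[phi(x) * f(x)], estimated on the sample xs."""
    exact = float(np.mean([phi(x) * f(x) for x in xs]))
    return exact + rng.uniform(-tau, tau)
```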

  10. CSQ learnable ⊆ Evolvable [F. 08]
      [Diagram: PAC learnable ⊇ learnable with the Perf_s oracle = CSQ learnable = Evolvable]

  11. Proof outline
      Replace queries for performance values with approximate comparisons.
      For a hypothesis φ : X → [-1,1], tolerance τ > 0 and threshold θ, the CSQ> oracle returns:
      o 1 if E_D[φ(x)·f(x)] ≥ θ + τ
      o 0 if E_D[φ(x)·f(x)] ≤ θ - τ
      o 0 or 1 otherwise
      Design an evolution algorithm with mutations that simulate the comparison queries.
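An idealized CSQ> comparison oracle is easy to state in code (Python; the function name and the arbitrary choice of answer inside the tolerance band are mine).

```python
def csq_gt_oracle(corr, theta, tau):
    """Given the true correlation corr = E_D[phi(x) * f(x)], a threshold theta
    and a tolerance tau, return 1 if corr >= theta + tau, 0 if corr <= theta - tau,
    and either answer (here: 1) in the band in between."""
    if corr >= theta + tau:
        return 1
    if corr <= theta - tau:
        return 0
    return 1  # inside the tolerance band either 0 or 1 is a valid response
```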

  12. From comparisons to mutations
      For a hypothesis φ : X → [-1,1], tolerance τ > 0 and threshold θ, the CSQ> oracle returns 1 if E_D[φ(x)·f(x)] ≥ θ + τ, 0 if E_D[φ(x)·f(x)] ≤ θ - τ, and 0 or 1 otherwise.
      The beneficial/neutral threshold is set as a function of τ; the mutation pool size p and the performance sample size s are polynomial in 1/τ and log(1/δ).
      With probability at least 1 - δ, Select(f, ε, r) = r_b where b is a valid response to the comparison query.

  13. Simulating a CSQ> algorithm
      Need to answer q queries; φ_i is the function obtained after answering i queries.
      To answer query i with threshold θ_i, the beneficial/neutral threshold, the mutation sample size p, and the performance sample size s are again set as polynomials in q, 1/τ and log(1/δ).
      This leads to representations with values in [-q, q]! Rescale by 1/q to get functions with values in [-1,1].
      Given the answers to the queries, one can compute h such that Perf_D(h, f) ≥ 1 - ε and mutate into it.

  14. CSQ learnable ⊆ Evolvable [F. 08; F. 09]
      [Diagram: PAC learnable ⊇ CSQ learnable = Evolvable]
      The equivalence holds under many variants of the model, e.g. optimizing selection; recombination [Kanade 11]; changing thresholds; the number of mutations.

  15. How powerful is CSQ learning?
      SQ learning [Kearns 93]: the learner submits a query ψ(x, ℓ); the SQ oracle returns v such that |v - E_D[ψ(x, f(x))]| ≤ τ.
      Many known algorithms are essentially described in SQ or can easily be translated to SQ:
      o Boolean conjunctions, decision lists, simple geometric concepts, AC0 [Kearns 93]
      Several others were found with more effort:
      o Halfspaces with a large margin [Bylander 94]
      o General LTFs [Blum, Frieze, Kannan, Vempala 96; Dunagan, Vempala 04]
      General ML techniques:
      o Nearest neighbor
      o Gradient descent
      o SVM
      o Boosting
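For comparison with the CSQ oracle above, a simulated SQ oracle (Python; illustrative only, names are mine): the query now sees both the point and its label.

```python
import numpy as np

def sq_oracle(psi, f, xs, tau, rng=np.random.default_rng(0)):
    """Answer a statistical query psi(x, label) with range [-1,1] by returning
    any value within tolerance tau of E_D[psi(x, f(x))], estimated on the sample xs."""
    exact = float(np.mean([psi(x, f(x)) for x in xs]))
    return exact + rng.uniform(-tau, tau)
```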

  16. Perceptron in SQ
      Recall the Perceptron algorithm:
      o Add false-negative examples, subtract false-positive examples.
      Use SQs to find the centroid of the false positives: E_D[x_1 · 1(h(x) = 1 ∧ f(x) = -1)] gives the first coordinate of the centroid.
      Use the centroid for the Perceptron update.
      [Figure: a false positive among the negative points relative to the current halfspace]
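A sketch of how the centroid of the false positives can be obtained with one statistical query per coordinate (Python; the function name and the oracle interface follow the sq_oracle sketch above and are my assumptions).

```python
import numpy as np

def false_positive_centroid(sq, h, n):
    """Estimate the (unnormalized) centroid of the false positives of h with n
    statistical queries: the i-th query asks for E_D[x_i * 1{h(x) = 1 and f(x) = -1}].
    `sq(psi)` is assumed to return a tau-accurate estimate of E_D[psi(x, f(x))]."""
    centroid = np.zeros(n)
    for i in range(n):
        psi = lambda x, label, i=i: x[i] if (h(x) == 1 and label == -1) else 0.0
        centroid[i] = sq(psi)
    return centroid  # used in place of an individual false positive in the Perceptron update
```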

  17. If D is fixed then SQ ⊆ CSQ
      Decompose each SQ into a CSQ plus a constant (a target-independent expectation that can be computed from the fixed D).
      Corollary: linear threshold functions are evolvable for any fixed distribution D.
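The decomposition is a one-line identity; a minimal sketch (Python; the helper name is mine): ψ(x, f(x)) = g0(x) + f(x)·g1(x), where E_D[g0(x)] does not depend on the target and E_D[g1(x)·f(x)] is exactly a CSQ.

```python
def decompose_sq(psi):
    """Split a statistical query psi(x, label), label in {-1, +1}, as
        E[psi(x, f(x))] = E[g0(x)] + E[g1(x) * f(x)],
    where g0(x) = (psi(x, +1) + psi(x, -1)) / 2 is target-independent (a constant
    once D is fixed) and g1(x) = (psi(x, +1) - psi(x, -1)) / 2 defines a CSQ."""
    g0 = lambda x: (psi(x, +1) + psi(x, -1)) / 2.0
    g1 = lambda x: (psi(x, +1) - psi(x, -1)) / 2.0
    return g0, g1
```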

  18. Distribution-independent CSQ
      Single points are learnable [F. 09].
      Characterization of weak learning [F. 08]:
      o Better than random guessing: E_D[h(x)·f(x)] ≥ 1/poly(n)
      o C is weakly CSQ learnable if and only if all functions in C can be represented as linear threshold functions with a significant margin over a poly-size set of Boolean features.
      General linear thresholds are not weakly CSQ learnable [Goldmann, Hastad, Razborov 95] (but they are SQ learnable).
      Conjunctions are not CSQ learnable [F. 11].

  19. Further directions
      Characterize (strong) evolvability (CSQ learning).
      o Strengthen the lower bound for conjunctions.
      Are thresholds on a line evolvable distribution-independently?
      With performance measured by the quadratic loss, Perf_D(r, f) = -E_D[(r(x) - f(x))^2], all of SQ is evolvable [F. 09].
