BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Date iCal//NONSGML kigkonsult.se iCalcreator 2.20.2//
METHOD:PUBLISH
X-WR-CALNAME;VALUE=TEXT:DIAG Events
BEGIN:VTIMEZONE
TZID:Europe/Paris
BEGIN:STANDARD
DTSTART:20251026T030000
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
END:STANDARD
BEGIN:DAYLIGHT
DTSTART:20250330T020000
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
END:DAYLIGHT
END:VTIMEZONE
BEGIN:VEVENT
UID:calendar.29382.field_data.0@oba.diag.uniroma1.it
DTSTAMP:20260404T193128Z
CREATED:20250628T145248Z
DESCRIPTION:Abstract: Neural networks have shown great promise across geosc
 ientific applications\, yet their complex\, nonlinear nature often hinders
  interpretability and limits scientific insight and model trust. Explaina
 ble AI (XAI) methods aim to attribute a model's prediction to specific in
 put features\, but their evaluation typically relies on image-based bench
 mark datasets like MNIST or ImageNet\, which lack objective ground truth 
 for attribution. In this work\, we introduce a framework to generate benc
 hmark datasets with known attribution ground truth\, using additively sep
 arable functions. We construct a large synthetic dataset\, train fully co
 nnected networks to learn the target functions\, and evaluate the perform
 ance of various XAI methods by comparing their attribution maps to the kn
 own ground truth. Our results highlight when and where specific XAI metho
 ds succeed or fail. This benchmark approach provides a much-needed founda
 tion for rigorous\, objective assessment of XAI tools in the geosciences
 \, paving the way for more trustworthy models and deeper scientific disco
 very.\n\nShort Bio: Professor Antonios Mamalakis is an environmental data
  scientist whose research focuses on applying statistical methods\, machi
 ne learning\, deep learning\, and explainable AI to tackle key challenges
  in environmental science. His work explores topics such as hydroclimate 
 prediction\, climate teleconnections\, causal discovery\, and climate cha
 nge impacts. Before joining the University of Virginia\, he was a researc
 h scientist at Colorado State University\, where he led efforts to evalua
 te explainable AI tools in geosciences. His research has been featured in
  high-impact journals such as Nature Communications\, Nature Climate Chan
 ge\, and Geophysical Research Letters. Professor Mamalakis holds a Ph.D. 
 in Civil and Environmental Engineering from the University of California
 \, Irvine\, and serves as Associate Editor for the AMS journal Artificial
  Intelligence for the Earth Systems.
DTSTART;TZID=Europe/Paris:20250704T100000
DTEND;TZID=Europe/Paris:20250704T100000
LAST-MODIFIED:20250630T065401Z
LOCATION:Aula Magna\, DIAG
SUMMARY:AI Attribution Benchmarks for Geosciences: Are We Gaining the Right
  Insights from Explainable AI? - Professor Antonios Mamalakis
URL;VALUE=URI:http://oba.diag.uniroma1.it/node/29382
END:VEVENT
END:VCALENDAR
