Coherency Loss for Hierarchical Time Series Forecasting
In hierarchical time series forecasting, some series are aggregated from others, producing a known coherency metric between series. We present a new method for enforcing coherency on hierarchical time series forecasts. We propose a new loss function, called Network Coherency Loss, that minimizes the coherency loss of the weight and bias of the final linear layer of a neural network. We compare it against a baseline without coherency and a state-of-the-art method that uses projection to strictly enforce coherency. We find that, by choosing the Network Coherency Loss parameters on validation data, we improve accuracy over both benchmark models on four datasets of varying sizes. We also find that, compared to an alternative loss function also designed to produce coherency, our Network Coherency Loss produces similar accuracies but improves coherency on the test data.
Main Author: | Hensgen, Michael Lowell |
---|---|
Other Authors: | Perakis, Georgia |
Department: | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science |
Degree: | M.Eng. |
Format: | Thesis |
Published: | Massachusetts Institute of Technology, 2024 |
Online Access: | https://hdl.handle.net/1721.1/156799 |
Rights: | In Copyright - Educational Use Permitted; copyright retained by author(s) (https://rightsstatements.org/page/InC-EDU/1.0/) |
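The record does not reproduce the thesis's implementation, but the abstract's description (a penalty on the weight and bias of the final linear layer) admits a natural reading. Below is a minimal PyTorch sketch, assuming the hierarchy's aggregation constraints are encoded as a matrix `A` such that coherent forecasts satisfy `A @ y = 0`; the names `network_coherency_loss`, `A`, `head`, and the weight `lam` are illustrative assumptions, not the thesis's code:

```python
import torch
import torch.nn as nn

def network_coherency_loss(final_layer: nn.Linear, A: torch.Tensor) -> torch.Tensor:
    """Penalize incoherency of the final layer's weight and bias.

    For forecasts y_hat = W h + b, A @ y_hat = (A @ W) h + A @ b, so the
    forecasts are coherent for *every* hidden state h exactly when
    A @ W = 0 and A @ b = 0; we penalize the squared norms of both.
    """
    weight_term = torch.linalg.matrix_norm(A @ final_layer.weight) ** 2
    bias_term = torch.linalg.vector_norm(A @ final_layer.bias) ** 2
    return weight_term + bias_term

# Toy 3-series hierarchy: series 0 is the total, so total - s1 - s2 = 0.
A = torch.tensor([[1.0, -1.0, -1.0]])
head = nn.Linear(16, 3)   # final linear layer of some forecasting backbone
mse = nn.MSELoss()

lam = 0.1                 # hypothetical penalty weight; per the abstract,
                          # such parameters are chosen on validation data
h = torch.randn(8, 16)    # batch of hidden states from the backbone
y = torch.randn(8, 3)     # forecast targets
loss = mse(head(h), y) + lam * network_coherency_loss(head, A)
loss.backward()
```

Since `A @ (W h + b) = (A @ W) h + A @ b`, driving both penalty terms to zero makes every forecast the head produces coherent regardless of the hidden state, which is presumably why the loss targets the layer's parameters rather than individual predictions.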
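The projection benchmark mentioned in the abstract is commonly realized by orthogonally projecting raw forecasts onto the coherent subspace (the null space of `A`); whether the thesis uses exactly this form is not stated in the record. A sketch under the same assumptions, reusing `head` and `h` from above:

```python
def project_coherent(y_hat: torch.Tensor, A: torch.Tensor) -> torch.Tensor:
    """Orthogonal projection onto {y : A @ y = 0}: P = I - A^T (A A^T)^{-1} A."""
    n = A.shape[1]
    P = torch.eye(n) - A.T @ torch.linalg.solve(A @ A.T, A)
    return y_hat @ P.T    # P is symmetric; transpose kept for clarity

y_proj = project_coherent(head(h).detach(), A)
# A @ y_proj.T is ~0 for every row: the projected forecasts are strictly
# coherent, which is the "strict enforcement" the abstract contrasts with
# the soft Network Coherency Loss penalty.
```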