MetaGreen: Meta-Learning Inspired Transformer Selection for Green Semantic Communication

Shubhabrata Mukherjee, Cory Beard, Sejun Song

TL;DR

This work lays the foundation for energy-efficient model selection and green semantic communication. By cumulatively applying EOSL, the model selection system adapts to changing contexts, leveraging historical EOSL values to guide the learning process.

Abstract

Semantic Communication can transform the way we transmit information, prioritizing meaningful and effective content over individual symbols or bits. This evolution promises significant benefits, including reduced latency, lower bandwidth usage, and higher throughput compared to traditional communication. However, the development of Semantic Communication faces a crucial challenge: the need for universal metrics to benchmark the joint effects of semantic information loss and energy consumption. This research introduces an innovative solution: the "Energy-Optimized Semantic Loss" (EOSL) function, a novel multi-objective loss function that effectively balances semantic information loss and energy consumption. Through comprehensive experiments on transformer models, including energy benchmarking, we demonstrate the remarkable effectiveness of EOSL-based model selection. We have established that EOSL-based transformer model selection achieves up to 83% better similarity-to-power ratio (SPR) compared to BLEU score-based selection and 67% better SPR compared to selection based solely on lowest power usage. Furthermore, we extend the applicability of EOSL to diverse and varying contexts, inspired by the principles of Meta-Learning. By cumulatively applying EOSL, we enable the model selection system to adapt to these changing contexts, leveraging historical EOSL values to guide the learning process. This work lays the foundation for energy-efficient model selection and the development of green semantic communication.
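The abstract describes EOSL as a multi-objective loss balancing semantic information loss against energy consumption, applied cumulatively so that historical values steer model selection. The sketch below illustrates that selection idea only; the `eosl` weighting, the `alpha` blending factor, and the function and model names are assumptions for illustration, not the paper's actual formula.

```python
def eosl(semantic_loss, energy, weight=0.5):
    """Hypothetical EOSL: a weighted sum of normalized semantic loss
    and normalized energy cost (both assumed to lie in [0, 1])."""
    return weight * semantic_loss + (1 - weight) * energy

def select_model(candidates, history=None, alpha=0.7):
    """Pick the candidate with the lowest (history-smoothed) EOSL.

    candidates: dict name -> (semantic_loss, energy)
    history:    dict name -> previous cumulative EOSL (optional)
    alpha:      weight on the current measurement vs. history
    """
    scores = {}
    for name, (loss, energy) in candidates.items():
        score = eosl(loss, energy)
        if history and name in history:
            # Cumulative application: blend with historical EOSL so the
            # selector adapts across changing contexts (meta-learning idea).
            score = alpha * score + (1 - alpha) * history[name]
        scores[name] = score
    return min(scores, key=scores.get), scores

# Illustrative candidates: an accurate but power-hungry model vs. a
# slightly lossier, far cheaper one.
best, scores = select_model({
    "model_a": (0.10, 0.80),
    "model_b": (0.25, 0.20),
})
```

With equal weighting, `model_b` wins here (EOSL 0.225 vs. 0.45): a modest loss in semantic fidelity is outweighed by the large energy saving, which is the trade-off EOSL is designed to capture.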

Paper Structure

This paper contains 13 sections, 18 equations, 9 figures, 7 tables.

Figures (9)

  • Figure 1: The basic blocks of a semantic communication
  • Figure 2: Evolution of complexity and training requirement of models
  • Figure 3: Effect of semantic noise during semantic transformation
  • Figure 4: End to end image/text transformer based SemCom with communication channel
  • Figure 5: Resource Utilization During Inference
  • ...and 4 more figures