Microcontrollers typically run at a single working (system, bus) clock frequency all the time, stopping the clocks altogether when power saving is needed. However, sometimes the working clock frequency has to be changed, for whatever reason - it may be to sustain operation while saving power, or to handle an emergency in which the primary clock source (e.g. a crystal oscillator) fails and a backup clock (e.g. an internal RC oscillator) is brought in.
It may sound obvious that in such a situation, peripherals whose function depends primarily on the system clock frequency, e.g. UART, have to be readjusted. However, as this is a seldom performed operation, there may be less than obvious sources of error or confusion in the way.
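The UART case can be sketched as follows - a minimal model, assuming an STM32-style USART whose baud-rate register (USART_BRR, oversampling by 16) holds the integer divisor of the peripheral clock. The point is that the divisor has to be recomputed from the new clock after every switch:

```c
#include <stdint.h>

/* Rounded integer divisor for an STM32-style USART_BRR (oversampling
   by 16): BRR = f_PCLK / baud. The function name and this model are
   illustrative, not taken from any particular library. */
static uint32_t usart_brr_for(uint32_t pclk_hz, uint32_t baud)
{
    return (pclk_hz + baud / 2u) / baud;
}

/* After switching e.g. from a 16 MHz crystal-derived clock to the 8 MHz
   internal RC, the divisor for the same 9600 Bd roughly halves; writing
   the old value would double the effective baud rate. */
```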
For example, if "libraries" such as SPL or Cube are used for the setup, the system clock frequency is sometimes calculated from a value given as a constant by a #define, so it may require some understanding of the involved mechanisms to coax the usual setup routines into using a variable instead.
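CMSIS, for instance, provides the global SystemCoreClock variable (refreshed by SystemCoreClockUpdate() from the RCC registers); a setup routine that reads such a variable keeps working after a clock switch, while one that bakes in a #define like HSE_VALUE does not. A self-contained sketch, with the CMSIS variable modelled as a plain global:

```c
#include <stdint.h>

/* Stand-in for the CMSIS SystemCoreClock variable; in a real project
   it is declared by the device headers and refreshed by
   SystemCoreClockUpdate(). */
static uint32_t SystemCoreClock = 72000000u;   /* e.g. PLL from HSE */

/* Deriving a timing constant from the variable instead of a #define
   means it stays correct after the clock is switched. Hypothetical
   helper, for illustration only. */
static uint32_t ticks_per_ms(uint32_t core_clk_hz)
{
    return core_clk_hz / 1000u;
}
```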
Some routines may be written assuming the peripherals' control registers are at their reset values (e.g. using RMW operations such as |= or &= to modify only certain bits of a bitfield in a register); these may need to be rewritten for correct operation.
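The pitfall can be shown on a hypothetical 2-bit prescaler field: OR-ing can only set bits, so it works on a freshly reset register but corrupts the field when an old value is already in place; the whole field has to be cleared first.

```c
#include <stdint.h>

/* Hypothetical 2-bit prescaler field at bits [1:0] of a control
   register; names are illustrative, not from any real device header. */
#define PSC_MASK  0x3u
#define PSC_DIV2  0x1u
#define PSC_DIV8  0x3u

/* Wrong on a non-reset register: |= can only set bits, so switching
   from DIV8 (0b11) to DIV2 (0b01) leaves 0b11 in the field. */
static uint32_t set_psc_or(uint32_t reg, uint32_t psc)
{
    return reg | psc;
}

/* Correct: clear the whole field, then write the new value. */
static uint32_t set_psc_field(uint32_t reg, uint32_t psc)
{
    return (reg & ~PSC_MASK) | (psc & PSC_MASK);
}
```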
Some peripherals don't allow changing timing-related registers while operating, or such a change may leave the peripheral in an unknown, non- or partially-functioning state. Such a peripheral may then require a deeper readjustment, possibly a complete reset (e.g. by pulsing the respective RCC_AxBxRSTR bit) followed by a new setup.
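The reset-and-reconfigure path can be sketched like this - a minimal model assuming STM32 conventions, where setting the peripheral's bit in the RCC reset register holds it in reset (returning its registers to their documented reset values) and clearing the bit releases it. Register and bit names are illustrative:

```c
#include <stdint.h>

static uint32_t rcc_apb1rstr;            /* stand-in for RCC->APB1RSTR */
static uint32_t tim2_psc = 0x1234u;      /* modeled TIM2 register, already configured */

#define RCC_APB1RSTR_TIM2RST  (1u << 0)  /* bit position per reference manual */

static void periph_reset(uint32_t rst_bit)
{
    rcc_apb1rstr |= rst_bit;   /* assert reset */
    tim2_psc = 0;              /* model: peripheral registers go to reset values */
    rcc_apb1rstr &= ~rst_bit;  /* release reset; peripheral must be set up anew */
}
```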
Also, peripherals which are clocked from an external clock (e.g. TIM in external clock mode, or I2C in slave mode) may still employ internally clocked filters on their input signals, and these may need readjustment too.
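Since such filters (e.g. the ETF/ICF fields of an STM32 timer) sample with the internal clock, their real-time duration scales inversely with it. A sketch, assuming a filter that requires n_samples consecutive identical samples at the internal clock rate:

```c
#include <stdint.h>

/* Real-time duration, in nanoseconds, of an internally clocked digital
   filter needing n_samples identical samples at f_clk_hz. Illustrative
   model; real ETF/ICF encodings also select a prescaled sampling clock. */
static uint32_t filter_ns(uint32_t f_clk_hz, uint32_t n_samples)
{
    return (uint32_t)((uint64_t)n_samples * 1000000000u / f_clk_hz);
}

/* Dropping from 72 MHz to 8 MHz stretches the same 8-sample filter
   roughly ninefold - pulses it used to pass may now be swallowed. */
```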