sbc: do not set sample format in parser

Commit bdd31feec9 changed the SBC decoder to only set the output
sample format on init, instead of setting it explicitly on each frame,
which is correct. But the SBC parser overrides the sample format to S16,
which triggers a crash when combining the parser and the decoder.

Fix the issue by no longer setting the sample format in the parser,
since setting it there was wrong in the first place.

Signed-off-by: James Almer <jamrial@gmail.com>
Tag: n4.4
Author: Arnaud Vrac
Committer: James Almer
Committed: 4 years ago
Commit: 29993b2947

1 changed file with 0 additions and 2 deletions
libavcodec/sbc_parser.c (+0, -2)

@@ -42,7 +42,6 @@ static int sbc_parse_header(AVCodecParserContext *s, AVCodecContext *avctx,
 
     if (data[0] == MSBC_SYNCWORD && data[1] == 0 && data[2] == 0) {
         avctx->channels = 1;
-        avctx->sample_fmt = AV_SAMPLE_FMT_S16;
         avctx->sample_rate = 16000;
         avctx->frame_size = 120;
         s->duration = avctx->frame_size;
@@ -66,7 +65,6 @@ static int sbc_parse_header(AVCodecParserContext *s, AVCodecContext *avctx,
                        + (joint * subbands)) + 7) / 8;
 
         avctx->channels = channels;
-        avctx->sample_fmt = AV_SAMPLE_FMT_S16;
         avctx->sample_rate = sample_rates[sr];
         avctx->frame_size = subbands * blocks;
         s->duration = avctx->frame_size;