Should the "Use NuPlayer" option be enabled?

After a double-wipe and flashing the new ROM, with default settings the Developer options toggle "Use NuPlayer (instead of AwesomePlayer)" is enabled.
The beta-test group says enabling the NuPlayer option fixes screen corruption, but answers on Baidu say NuPlayer stutters and AwesomePlayer is smoother. Which one should we pick?
Single-choice poll, 1456 participants:
1. NuPlayer
2. AwesomePlayer
AW has slightly better compatibility. If you run into problems, just switch to AW; if not, leave it alone.
I need ViPER4Android (V4A), so I definitely skip Nu.
No idea, completely lost.
Don't even know what it's for.
What do the two options mean?
I've always left it on the default, Nu...
Copyright (C) 2017 MIUI
What is this feature anyway, and what does enabling it do?
First reply! Grabbing the sofa whether it's useful or not... thanks, OP.
Didn't know about this one yet.
NuPlayer is a streaming-media player in the Android native framework and is part of the open-source code (and since Android 5.0 it is also used for local playback). The player itself is invisible to users: whenever you play a video, it is ultimately what gets used. AwesomePlayer has bugs, and Google dropped it after L, so on L and above you should use Nu, even though it still needs polish. (Some users report that their devices cannot stream video properly: some only with certain services, some not at all, and some even say they cannot record video. These problems only occur under specific conditions; you may never hit them.) You're welcome.
Dolby Atmos crashes on launch -- how do I fix it? I have already tried temporarily disabling SELinux and turning off "Use NuPlayer" in Developer options. I flashed it from internal storage with TWRP (on a Meizu m2 note); the bootloader is unlocked.
It is incompatible with the phone's built-in audio processing; only one of them can be active.
Which phone brands can it be flashed onto and still work?
As for compatibility: on my phone the two can currently be flashed side by side without interfering, provided only one is enabled at a time.
Android NuPlayer: Key Points Explained
This article walks through NuPlayer in detail, based on Android N. NuPlayer is the player Android uses for both local and streaming playback.
1. The AHandler mechanism
First, the AHandler mechanism, which is everywhere in NuPlayer:
frameworks/av/include/media/stagefright/foundation/
frameworks/av/media/libstagefright/foundation/
AHandler is an asynchronous message mechanism implemented in the Android native layer. Everything in it is handled asynchronously: variables are wrapped in an AMessage, which is placed on a queue; a dedicated background thread takes messages off that queue and executes them, with onMessageReceived as the execution function.
The AHandler mechanism involves the following classes:
AMessage
The message class. A message is constructed and then delivered to an ALooper via post():
status_t AMessage::post(int64_t delayUs) {
    sp<ALooper> looper = mLooper.promote();
    if (looper == NULL) {
        ALOGW("failed to post message as target looper for handler %d is gone.", mTarget);
        return -ENOENT;
    }

    looper->post(this, delayUs);
    return OK;
}

void AMessage::deliver() {
    sp<AHandler> handler = mHandler.promote();
    if (handler == NULL) {
        ALOGW("failed to deliver message as target handler %d is gone.", mTarget);
        return;
    }

    // see AHandler::deliverMessage -- a looper->post() eventually calls deliver()
    // here, which hands the message to its handler
    handler->deliverMessage(this);
}
AHandler
The message-handling class, usually used as a base class; subclasses must implement onMessageReceived:
void AHandler::deliverMessage(const sp<AMessage> &msg) {
    onMessageReceived(msg);
    mMessageCounter++;
}
ALooper
Paired one-to-one with an AHandler; it stores messages and dispatches them to the handler, and has a one-to-many relationship with AMessage:
// posts a message on this looper with the given timeout
void ALooper::post(const sp<AMessage> &msg, int64_t delayUs) {
    Mutex::Autolock autoLock(mLock);

    int64_t whenUs;
    if (delayUs > 0) {
        whenUs = GetNowUs() + delayUs;
    } else {
        whenUs = GetNowUs();
    }

    List<Event>::iterator it = mEventQueue.begin();
    while (it != mEventQueue.end() && (*it).mWhenUs <= whenUs) {
        ++it;
    }

    Event event;
    event.mWhenUs = whenUs;
    event.mMessage = msg;

    if (it == mEventQueue.begin()) {
        mQueueChangedCondition.signal();
    }
    mEventQueue.insert(it, event);
}
----------------------------------------------------------
status_t ALooper::start(
        bool runOnCallingThread, bool canCallJava, int32_t priority) {
    if (runOnCallingThread) {
        {
            Mutex::Autolock autoLock(mLock);
            if (mThread != NULL || mRunningLocally) {
                return INVALID_OPERATION;
            }
            mRunningLocally = true;
        }
        do {
        } while (loop());
        return OK;
    }

    Mutex::Autolock autoLock(mLock);
    if (mThread != NULL || mRunningLocally) {
        return INVALID_OPERATION;
    }

    mThread = new LooperThread(this, canCallJava);
    status_t err = mThread->run(
            mName.empty() ? "ALooper" : mName.c_str(), priority);
    if (err != OK) {
        mThread.clear();
    }
    return err;
}

bool ALooper::loop() {
    Event event;
    {
        Mutex::Autolock autoLock(mLock);
        if (mThread == NULL && !mRunningLocally) {
            return false;
        }
        if (mEventQueue.empty()) {
            mQueueChangedCondition.wait(mLock);
            return true;
        }
        int64_t whenUs = (*mEventQueue.begin()).mWhenUs;
        int64_t nowUs = GetNowUs();

        if (whenUs > nowUs) {
            int64_t delayUs = whenUs - nowUs;
            mQueueChangedCondition.waitRelative(mLock, delayUs * 1000ll);
            return true;
        }

        event = *mEventQueue.begin();
        mEventQueue.erase(mEventQueue.begin());
    }

    event.mMessage->deliver(); // see AHandler::deliverMessage
    return true;
}
LooperThread
This thread drives message dispatch by calling ALooper's loop() method:
virtual status_t readyToRun() {
    mThreadId = androidGetThreadId();
    return Thread::readyToRun();
}

virtual bool threadLoop() {
    return mLooper->loop();
}
ALooperRoster
One-to-many with handlers: it manages the one-to-one Looper-Handler mapping and is responsible for releasing stale handlers:
ALooper::handler_id ALooperRoster::registerHandler(
        const sp<ALooper> &looper, const sp<AHandler> &handler) {
    Mutex::Autolock autoLock(mLock);

    if (handler->id() != 0) {
        CHECK(!"A handler must only be registered once.");
        return INVALID_OPERATION;
    }

    HandlerInfo info;
    info.mLooper = looper;
    info.mHandler = handler;
    ALooper::handler_id handlerID = mNextHandlerID++; // one-to-one
    mHandlers.add(handlerID, info);                   // one-to-many

    handler->setID(handlerID, looper);

    return handlerID;
}

void ALooperRoster::unregisterHandler(ALooper::handler_id handlerID) {
    Mutex::Autolock autoLock(mLock);

    ssize_t index = mHandlers.indexOfKey(handlerID);
    if (index < 0) {
        return;
    }

    const HandlerInfo &info = mHandlers.valueAt(index);
    sp<AHandler> handler = info.mHandler.promote();
    if (handler != NULL) {
        handler->setID(0, NULL);
    }
    mHandlers.removeItemsAt(index);
}

void ALooperRoster::unregisterStaleHandlers() {
    Vector<sp<ALooper> > activeLoopers;
    {
        Mutex::Autolock autoLock(mLock);

        for (size_t i = mHandlers.size(); i > 0;) {
            i--;
            const HandlerInfo &info = mHandlers.valueAt(i);

            sp<ALooper> looper = info.mLooper.promote();
            if (looper == NULL) {
                ALOGV("Unregistering stale handler %d", mHandlers.keyAt(i));
                mHandlers.removeItemsAt(i);
            } else {
                // At this point 'looper' might be the only sp<> keeping
                // the object alive. To prevent it from going out of scope
                // and having ~ALooper call this method again recursively
                // and then deadlocking because of the Autolock above, add
                // it to a Vector which will go out of scope after the lock
                // has been released.
                activeLoopers.add(looper);
            }
        }
    }
}
Creating the asynchronous message mechanism:
sp<ALooper> mLooper = new ALooper;                          // create an ALooper instance
sp<AHandlerReflector> mHandler = new AHandlerReflector;     // create an AHandler instance
mLooper->setName("xxxxx");                                  // name the looper
mLooper->start(false, true, PRIORITY_XXX);                  // create and start the looper thread according to the arguments
mLooper->registerHandler(mHandler);                         // registerHandler calls AHandler::setID to attach the looper to the handler
sp<AMessage> msg = new AMessage(kWhatSayGoodbye, mHandler); // AMessage's constructor looks up and stores the handler's looper
msg->post();                                                // calls the looper's post method
The call path of a message post:
AMessage::post
  ALooper::post
    mEventQueue.insert
    mQueueChangedCondition.signal()   // if there were no events before, wake the looper thread
ALooper::loop()
    if (mEventQueue.empty()) {        // if the queue is empty, wait
        mQueueChangedCondition.wait(mLock);
    }
    event = *mEventQueue.begin();
    event.mMessage->deliver();
      AHandler::deliverMessage
        AHandlerReflector::onMessageReceived   // the concrete implementation
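To make the mechanism concrete, here is a minimal, standalone analogue of the ALooper/AHandler pattern using only the C++ standard library. MiniLooper and runDemo are invented names for illustration; the real AOSP classes add handler registration, timed delivery, and strong/weak reference counting:

```cpp
#include <atomic>
#include <cassert>
#include <condition_variable>
#include <deque>
#include <functional>
#include <mutex>
#include <thread>

// A toy analogue of ALooper/AHandler: one background thread drains a queue.
class MiniLooper {
public:
    MiniLooper() : mStopped(false), mThread([this] { loop(); }) {}
    ~MiniLooper() { stop(); }

    // Like AMessage::post(): enqueue work and wake the looper thread.
    void post(std::function<void()> msg) {
        {
            std::lock_guard<std::mutex> lk(mLock);
            mQueue.push_back(std::move(msg));
        }
        mQueueChangedCondition.notify_one();
    }

    // Drain the remaining messages, then join the thread.
    void stop() {
        {
            std::lock_guard<std::mutex> lk(mLock);
            if (mStopped) return;
            mStopped = true;
        }
        mQueueChangedCondition.notify_one();
        mThread.join();
    }

private:
    // Like ALooper::loop(): sleep while the queue is empty, deliver otherwise.
    void loop() {
        for (;;) {
            std::function<void()> msg;
            {
                std::unique_lock<std::mutex> lk(mLock);
                mQueueChangedCondition.wait(
                        lk, [this] { return mStopped || !mQueue.empty(); });
                if (mQueue.empty()) return;  // stopped and fully drained
                msg = std::move(mQueue.front());
                mQueue.pop_front();
            }
            msg();  // like AMessage::deliver() -> AHandler::onMessageReceived()
        }
    }

    std::mutex mLock;
    std::condition_variable mQueueChangedCondition;
    std::deque<std::function<void()>> mQueue;
    bool mStopped;
    std::thread mThread;
};

// Post ten messages and count how many were delivered on the looper thread.
inline int runDemo() {
    std::atomic<int> counter{0};
    {
        MiniLooper looper;
        for (int i = 0; i < 10; ++i) {
            looper.post([&counter] { ++counter; });
        }
    }  // destructor stops the looper after the queue drains
    return counter.load();
}
```

The key design point carried over from ALooper is that posting only signals the condition variable; delivery always happens on the looper thread, so handlers never run concurrently with each other.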
Now to the main topic: NuPlayer.
frameworks/av/media/libmediaplayerservice/nuplayer/
NuPlayerDriver
NuPlayerDriver is a wrapper around NuPlayer that implements the MediaPlayerInterface; playback itself is carried out through NuPlayer. The way to read this code is: first see what NuPlayerDriver does, then jump to the corresponding implementation in NuPlayer, usually ending up in NuPlayer's onMessageReceived to see how the message is handled, and finally back in NuPlayerDriver's various notify methods to watch the flow come full circle. (The original post includes a diagram of the player state machine here.)
NuPlayerDriver::NuPlayerDriver(pid_t pid)
    : mState(STATE_IDLE),       // the initial state of the player state machine
      mIsAsyncPrepare(false),
      mAsyncResult(UNKNOWN_ERROR),
      mSetSurfaceInProgress(false),
      mDurationUs(-1),
      mPositionUs(-1),
      mSeekInProgress(false),
      mLooper(new ALooper),
      mPlayerFlags(0),
      mAtEOS(false),
      mLooping(false),
      mAutoLoop(false) {
    ALOGV("NuPlayerDriver(%p)", this);
    // matches the async-message creation pattern described above
    mLooper->setName("NuPlayerDriver Looper");

    mLooper->start(
            false, /* runOnCallingThread */
            true,  /* canCallJava */
            PRIORITY_AUDIO);

    // mPlayer is the NuPlayer, which inherits from AHandler
    mPlayer = AVNuFactory::get()->createNuPlayer(pid);
    mLooper->registerHandler(mPlayer);

    mPlayer->setDriver(this);
}

NuPlayerDriver::~NuPlayerDriver() {
    ALOGV("~NuPlayerDriver(%p)", this);
    mLooper->stop();
}
// in effect, the whole NuPlayerDriver is one big ALooper
AVNuFactory
Responsible for the key create calls. From it you can see that:
1. each NuPlayer corresponds to one process (pid), and
2. data flows Source -> Decoder -> Renderer, driven along the way by AMessages.
sp<NuPlayer> AVNuFactory::createNuPlayer(pid_t pid) {
    return new NuPlayer(pid);
}

sp<NuPlayer::DecoderBase> AVNuFactory::createPassThruDecoder(
        const sp<AMessage> &notify,
        const sp<NuPlayer::Source> &source,
        const sp<NuPlayer::Renderer> &renderer) {
    return new NuPlayer::DecoderPassThrough(notify, source, renderer);
}

sp<NuPlayer::DecoderBase> AVNuFactory::createDecoder(
        const sp<AMessage> &notify,
        const sp<NuPlayer::Source> &source,
        pid_t pid,
        const sp<NuPlayer::Renderer> &renderer) {
    return new NuPlayer::Decoder(notify, source, pid, renderer);
}

sp<NuPlayer::Renderer> AVNuFactory::createRenderer(
        const sp<MediaPlayerBase::AudioSink> &sink,
        const sp<AMessage> &notify,
        uint32_t flags) {
    return new NuPlayer::Renderer(sink, notify, flags);
}
Below, Source, Decoder, and Renderer are analyzed in turn.
Taking setDataSource as the entry point:
status_t NuPlayerDriver::setDataSource(const sp<IStreamSource> &source) {
    ALOGV("setDataSource(%p) stream source", this);
    Mutex::Autolock autoLock(mLock);

    if (mState != STATE_IDLE) {
        return INVALID_OPERATION;
    }
    mState = STATE_SET_DATASOURCE_PENDING;

    // the driver is only a wrapper; NuPlayer does the actual work
    mPlayer->setDataSourceAsync(source);

    while (mState == STATE_SET_DATASOURCE_PENDING) {
        mCondition.wait(mLock);
    }
    return mAsyncResult;
}
--------------------------------------
void NuPlayer::setDataSourceAsync(const sp<IStreamSource> &source) {
    sp<AMessage> msg = new AMessage(kWhatSetDataSource, this);
    sp<AMessage> notify = new AMessage(kWhatSourceNotify, this);
    msg->setObject("source", new StreamingSource(notify, source));
    msg->post(); // even inside NuPlayer nothing happens directly: a message is posted first,
                 // confirming that everything is driven by AMessages
}
---------------------------------------
void NuPlayer::onMessageReceived(const sp<AMessage> &msg) {
    switch (msg->what()) {
        case kWhatSetDataSource: // the actual handling happens here
        {
            ALOGV("kWhatSetDataSource");
            CHECK(mSource == NULL);

            status_t err = OK;
            sp<RefBase> obj;
            CHECK(msg->findObject("source", &obj));
            if (obj != NULL) {
                Mutex::Autolock autoLock(mSourceLock);
                mSource = static_cast<Source *>(obj.get()); // store it in mSource
            } else {
                err = UNKNOWN_ERROR;
            }

            CHECK(mDriver != NULL);
            sp<NuPlayerDriver> driver = mDriver.promote();
            if (driver != NULL) {
                driver->notifySetDataSourceCompleted(err); // tell the driver we are done
            }
            break;
        }
        ...
    }
}
---------------------------------------
void NuPlayerDriver::notifySetDataSourceCompleted(status_t err) {
    Mutex::Autolock autoLock(mLock);
    CHECK_EQ(mState, STATE_SET_DATASOURCE_PENDING);
    mAsyncResult = err;
    mState = (err == OK) ? STATE_UNPREPARED : STATE_IDLE; // back in the driver, advance the state machine
    mCondition.broadcast();
}
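The blocking handshake above can be reduced to a small state-machine sketch. All names here are hypothetical stand-ins; the real driver additionally guards mState with a mutex and condition variable and runs the completion callback on the looper thread:

```cpp
#include <cassert>

// States mirrored from NuPlayerDriver's state machine (a subset).
enum State {
    STATE_IDLE,
    STATE_SET_DATASOURCE_PENDING,
    STATE_UNPREPARED,
};

struct DriverSketch {
    State mState = STATE_IDLE;

    // Returns true if the transition was legal (mirrors the mState check
    // that yields INVALID_OPERATION in the real code).
    bool setDataSource() {
        if (mState != STATE_IDLE) return false;
        mState = STATE_SET_DATASOURCE_PENDING;  // async work would start here
        return true;
    }

    // Called back by the player: advance on success, fall back to IDLE on error.
    void notifySetDataSourceCompleted(bool ok) {
        mState = ok ? STATE_UNPREPARED : STATE_IDLE;
    }
};
```

This is why a second setDataSource call while one is pending is rejected: the state machine, not a lock, serializes the API.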
Now let's look at the concrete sources. They all derive from NuPlayer::Source (NuPlayerSource.h & NuPlayerSource.cpp):
1. HTTP - further classified as one of HTTPLiveSource, RTSPSource, or GenericSource
2. File - GenericSource
3. StreamSource - StreamingSource
4. DataSource - GenericSource
GenericSource
nuplayer/GenericSource.h & GenericSource.cpp
static int64_t kLowWaterMarkUs = 2000000ll;           // 2 secs
static int64_t kHighWaterMarkUs = 5000000ll;          // 5 secs
static int64_t kHighWaterMarkRebufferUs = 15000000ll; // 15 secs -- this watermark is newly added
static const ssize_t kLowWaterMarkBytes = 40000;
static const ssize_t kHighWaterMarkBytes = 200000;
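These watermark pairs implement simple hysteresis: rebuffering starts when buffered data falls under the low mark, and playback resumes only once it climbs back over the high mark. A minimal standalone sketch of that logic (the constants are GenericSource's defaults; the function itself is illustrative, not AOSP code):

```cpp
#include <cassert>
#include <cstdint>

static const int64_t kSketchLowWaterMarkUs  = 2000000ll;  // 2 secs
static const int64_t kSketchHighWaterMarkUs = 5000000ll;  // 5 secs

// Returns the new "buffering" state given the current one and the amount of
// buffered media: enter rebuffering below the low mark, leave it above the
// high mark, and otherwise keep the current state (hysteresis).
bool updateBuffering(bool buffering, int64_t bufferedDurationUs) {
    if (!buffering && bufferedDurationUs < kSketchLowWaterMarkUs) {
        return true;   // underrun: pause and rebuffer
    }
    if (buffering && bufferedDurationUs > kSketchHighWaterMarkUs) {
        return false;  // enough data again: resume playback
    }
    return buffering;  // between the marks: no state change
}
```

The gap between the two marks is what prevents rapid pause/resume flapping when the buffer level hovers around a single threshold.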
status_t NuPlayer::GenericSource::initFromDataSource() {
    // get track info and metadata
    ...
}

void NuPlayer::GenericSource::prepareAsync() {
    if (mLooper == NULL) {
        mLooper = new ALooper;
        mLooper->setName("generic");
        mLooper->start();
        mLooper->registerHandler(this);
    }

    sp<AMessage> msg = new AMessage(kWhatPrepareAsync, this);
    msg->post();
}

status_t NuPlayer::GenericSource::feedMoreTSData() {
    return OK;
}
LiveSession
libstagefright/httplive/LiveSession.h & cpp
// Bandwidth Switch Mark Defaults
const int64_t LiveSession::kUpSwitchMarkUs = 15000000ll;
const int64_t LiveSession::kDownSwitchMarkUs = 20000000ll;
const int64_t LiveSession::kUpSwitchMarginUs = 5000000ll;
const int64_t LiveSession::kResumeThresholdUs = 100000ll;

// Buffer Prepare/Ready/Underflow Marks
const int64_t LiveSession::kReadyMarkUs = 5000000ll;
const int64_t LiveSession::kPrepareMarkUs = 1500000ll;
const int64_t LiveSession::kUnderflowMarkUs = 1000000ll;
Everything related to the fetchers, bandwidth estimation (a sliding-window average, the same idea as in ExoPlayer), stream switching, and buffering lives here.
HTTPLiveSource
In the nuplayer directory:
enum Flags {
    // Don't log any URLs.
    kFlagIncognito = 1,
};

NuPlayer::HTTPLiveSource::HTTPLiveSource(...) {
    if (headers) { // it also has a headers mechanism
        mExtraHeaders = *headers;

        ssize_t index =
            mExtraHeaders.indexOfKey(String8("x-hide-urls-from-log"));
        if (index >= 0) {
            mFlags |= kFlagIncognito;
            mExtraHeaders.removeItemsAt(index);
        }
    }
}
---------------------------------------
void NuPlayer::HTTPLiveSource::prepareAsync() {
    if (mLiveLooper == NULL) {
        mLiveLooper = new ALooper; // an ALooper, as always
        mLiveLooper->setName("http live");
        mLiveLooper->start();
        mLiveLooper->registerHandler(this);
    }

    sp<AMessage> notify = new AMessage(kWhatSessionNotify, this);

    mLiveSession = new LiveSession(
            notify,
            (mFlags & kFlagIncognito) ? LiveSession::kFlagIncognito : 0,
            mHTTPService);
    mLiveLooper->registerHandler(mLiveSession);

    // HTTPLiveSource owns a LiveSession; much of the real work is done by LiveSession
    mLiveSession->connectAsync(
            mURL.c_str(), mExtraHeaders.isEmpty() ? NULL : &mExtraHeaders);
}
ATSParser
frameworks/av/media/libstagefright/mpeg2ts/ATSParser.cpp
It is simply a TS parser; despite the A-prefixed name, there is no message mechanism inside it.
StreamingSource
In the nuplayer directory:
void NuPlayer::StreamingSource::prepareAsync() {
    if (mLooper == NULL) {
        mLooper = new ALooper;
        mLooper->setName("streaming");
        mLooper->start(); // strikingly similar to the other sources
        mLooper->registerHandler(this);
    }

    notifyVideoSizeChanged();
    notifyFlagsChanged(0);
    notifyPrepared();
}
---------------------------------------
The data flow in StreamingSource is driven by onReadBuffer. EOS, discontinuities, and the like are handed to the ATSParser, which in turn hands the real work to AnotherPacketSource. In fact, all three sources mentioned here end up using AnotherPacketSource.
void NuPlayer::StreamingSource::onReadBuffer() {
    for (int32_t i = 0; i < kNumListenerQueuePackets; ++i) {
        char buffer[188];
        sp<AMessage> extra;
        // the actual reading is done by NuPlayerStreamListener
        ssize_t n = mStreamListener->read(buffer, sizeof(buffer), &extra);

        if (n == 0) {
            ALOGI("input data EOS reached.");
            mTSParser->signalEOS(ERROR_END_OF_STREAM); // EOS reached
            setError(ERROR_END_OF_STREAM);
            break;
        } else if (n == INFO_DISCONTINUITY) {
            int32_t type = ATSParser::DISCONTINUITY_TIME;

            int32_t mask;
            if (extra != NULL
                    && extra->findInt32(
                        IStreamListener::kKeyDiscontinuityMask, &mask)) {
                if (mask == 0) {
                    ALOGE("Client specified an illegal discontinuity type.");
                    setError(ERROR_UNSUPPORTED);
                    break;
                }
                type = mask;
            }

            mTSParser->signalDiscontinuity(
                    (ATSParser::DiscontinuityType)type, extra);
        } else if (n < 0) {
            break;
        } else {
            if (buffer[0] == 0x00) {
                // XXX legacy
                if (extra == NULL) {
                    extra = new AMessage;
                }

                uint8_t type = buffer[1];

                if (type & 2) {
                    int64_t mediaTimeUs;
                    memcpy(&mediaTimeUs, &buffer[2], sizeof(mediaTimeUs));

                    extra->setInt64(IStreamListener::kKeyMediaTimeUs, mediaTimeUs);
                }

                mTSParser->signalDiscontinuity(
                        ((type & 1) == 0)
                            ? ATSParser::DISCONTINUITY_TIME
                            : ATSParser::DISCONTINUITY_FORMATCHANGE,
                        extra);
            } else {
                status_t err = mTSParser->feedTSPacket(buffer, sizeof(buffer));

                if (err != OK) {
                    ALOGE("TS Parser returned error %d", err);

                    mTSParser->signalEOS(err);
                    setError(err);
                    break;
                }
            }
        }
    }
}
AnotherPacketSource
frameworks/av/media/libstagefright/mpeg2ts
Comparable to a chunk source in ExoPlayer: it handles buffer management, EOS and discontinuity handling, and so on. All three sources above eventually end up in AnotherPacketSource.
bool AnotherPacketSource::hasBufferAvailable(status_t *finalResult) {
    Mutex::Autolock autoLock(mLock);
    *finalResult = OK;
    if (!mEnabled) {
        return false;
    }
    if (!mBuffers.empty()) { // mBuffers is a List of ABuffers -- effectively a ring buffer
        return true;
    }

    *finalResult = mEOSResult;
    return false;
}
--------------------------------------
void AnotherPacketSource::queueDiscontinuity(
        ATSParser::DiscontinuityType type,
        const sp<AMessage> &extra,
        bool discard) {
    Mutex::Autolock autoLock(mLock);

    if (discard) {
        // Leave only discontinuities in the queue.
        ...
    }

    mEOSResult = OK;
    mLastQueuedTimeUs = 0;
    mLatestEnqueuedMeta = NULL;

    if (type == ATSParser::DISCONTINUITY_NONE) {
        return;
    }

    mDiscontinuitySegments.push_back(DiscontinuitySegment());

    sp<ABuffer> buffer = new ABuffer(0);
    buffer->meta()->setInt32("discontinuity", static_cast<int32_t>(type));
    buffer->meta()->setMessage("extra", extra);

    // push an ABuffer that records the discontinuity into the buffer queue, so
    // each Source handles the discontinuity correctly when it reads from the queue
    mBuffers.push_back(buffer);
    mCondition.signal();
}
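The in-band discontinuity trick above can be sketched in a few lines of portable C++. All names here are hypothetical; AnotherPacketSource itself uses a List of ABuffers plus per-buffer metadata:

```cpp
#include <cassert>
#include <cstdint>
#include <deque>
#include <vector>

// A discontinuity is queued in-band as a zero-length entry, so a reader sees
// it in order relative to the surrounding media packets.
struct Packet {
    std::vector<uint8_t> data;
    bool discontinuity = false;
};

class PacketQueue {
public:
    void queuePacket(std::vector<uint8_t> data) {
        mBuffers.push_back(Packet{std::move(data), false});
    }

    void queueDiscontinuity() {
        mBuffers.push_back(Packet{{}, true});  // zero-length marker, like ABuffer(0)
    }

    bool hasBufferAvailable() const { return !mBuffers.empty(); }

    // Returns false when the head of the queue is a discontinuity marker,
    // which the caller must handle (e.g. flush the decoder) before reading on.
    bool dequeue(Packet *out) {
        *out = std::move(mBuffers.front());
        mBuffers.pop_front();
        return !out->discontinuity;
    }

private:
    std::deque<Packet> mBuffers;
};
```

Keeping the marker in the same queue as the data is what guarantees the decoder flush happens at exactly the right point in the stream, with no extra synchronization.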
How the Decoder is initialized
Starting from the NuPlayer::onStart method:
void NuPlayer::onStart(int64_t startPositionUs) {
    if (!mSourceStarted) {
        mSourceStarted = true;
        mSource->start();
    }
    if (startPositionUs > 0) {
        performSeek(startPositionUs);
        ...
    }
    ...
    if (mSource->getFormat(false /* audio */) == NULL) {
        ...
    }
    ...
    sp<AMessage> notify = new AMessage(kWhatRendererNotify, this);
    ++mRendererGeneration;
    notify->setInt32("generation", mRendererGeneration);

    // the Renderer and its Looper are created here via AVNuFactory
    mRenderer = AVNuFactory::get()->createRenderer(mAudioSink, notify, flags);
    mRendererLooper = new ALooper;
    mRendererLooper->setName("NuPlayerRenderer");
    mRendererLooper->start(false, false, ANDROID_PRIORITY_AUDIO);
    mRendererLooper->registerHandler(mRenderer);

    // set the playback parameters on the Renderer
    status_t err = mRenderer->setPlaybackSettings(mPlaybackSettings);

    // hand the Renderer to the Decoders -- the two are now tied together
    if (mVideoDecoder != NULL) {
        mVideoDecoder->setRenderer(mRenderer);
    }
    if (mAudioDecoder != NULL) {
        mAudioDecoder->setRenderer(mRenderer);
    }

    // post the scan-sources message
    postScanSources();
}
--------------------------------------
case kWhatScanSources:
{
    int32_t generation;
    CHECK(msg->findInt32("generation", &generation));
    if (generation != mScanSourcesGeneration) {
        // Drop obsolete msg.
        break;
    }

    mScanSourcesPending = false;

    ALOGV("scanning sources haveAudio=%d, haveVideo=%d",
            mAudioDecoder != NULL, mVideoDecoder != NULL);

    bool mHadAnySourcesBefore =
        (mAudioDecoder != NULL) || (mVideoDecoder != NULL);
    bool rescan = false;

    // initialize video before audio because successful initialization of
    // video may change deep buffer mode of audio.
    // the decoders are instantiated here
    if (mSurface != NULL) {
        if (instantiateDecoder(false, &mVideoDecoder) == -EWOULDBLOCK) {
            rescan = true;
        }
    }

    // Don't try to re-open audio sink if there's an existing decoder.
    if (mAudioSink != NULL && mAudioDecoder == NULL) {
        if (instantiateDecoder(true, &mAudioDecoder) == -EWOULDBLOCK) {
            rescan = true;
        }
    }

    if (!mHadAnySourcesBefore
            && (mAudioDecoder != NULL || mVideoDecoder != NULL)) {
        // This is the first time we've found anything playable.
        if (mSourceFlags & Source::FLAG_DYNAMIC_DURATION) {
            schedulePollDuration();
        }
    }

    status_t err;
    if ((err = mSource->feedMoreTSData()) != OK) {
        if (mAudioDecoder == NULL && mVideoDecoder == NULL) {
            // We're not currently decoding anything (no audio or
            // video tracks found) and we just ran out of input data.
            if (err == ERROR_END_OF_STREAM) {
                notifyListener(MEDIA_PLAYBACK_COMPLETE, 0, 0);
            } else {
                notifyListener(MEDIA_ERROR, MEDIA_ERROR_UNKNOWN, err);
            }
        }
        break;
    }

    // like doSomeWork in ExoPlayer, postScanSources keeps cycling until done
    if (rescan) {
        msg->post(100000ll);
        mScanSourcesPending = true;
    }
    break;
}
---------------------------------------
Decoder initialization is completed in NuPlayer::instantiateDecoder:
status_t NuPlayer::instantiateDecoder(
        bool audio, sp<DecoderBase> *decoder, bool checkAudioModeChange) {
    ...
    if (!audio) {
        AString mime;
        CHECK(format->findString("mime", &mime));

        sp<AMessage> ccNotify = new AMessage(kWhatClosedCaptionNotify, this);
        if (mCCDecoder == NULL) {
            mCCDecoder = new CCDecoder(ccNotify); // create the closed-caption (subtitle) decoder
        }

        if (mSourceFlags & Source::FLAG_SECURE) {
            format->setInt32("secure", true);
        }

        if (mSourceFlags & Source::FLAG_PROTECTED) {
            format->setInt32("protected", true);
        }

        float rate = getFrameRate();
        if (rate > 0) {
            format->setFloat("operating-rate", rate * mPlaybackSettings.mSpeed);
        }
    }

    if (audio) {
        sp<AMessage> notify = new AMessage(kWhatAudioNotify, this);
        ++mAudioDecoderGeneration;
        notify->setInt32("generation", mAudioDecoderGeneration);

        if (checkAudioModeChange) {
            determineAudioModeChange(format);
        }
        if (mOffloadAudio) {
            mSource->setOffloadAudio(true /* offload */);

            const bool hasVideo = (mSource->getFormat(false /*audio */) != NULL);
            format->setInt32("has-video", hasVideo);
            // create the pass-through audio decoder via AVNuFactory
            *decoder = AVNuFactory::get()->createPassThruDecoder(notify, mSource, mRenderer);
        } else {
            AVNuUtils::get()->setCodecOutputFormat(format);
            mSource->setOffloadAudio(false /* offload */);
            // create a regular audio decoder
            *decoder = AVNuFactory::get()->createDecoder(notify, mSource, mPID, mRenderer);
        }
    } else {
        sp<AMessage> notify = new AMessage(kWhatVideoNotify, this);
        ++mVideoDecoderGeneration;
        notify->setInt32("generation", mVideoDecoderGeneration);

        // create the video decoder; note that the CC decoder is passed in as a parameter
        *decoder = new Decoder(
                notify, mSource, mPID, mRenderer, mSurface, mCCDecoder);

        // enable FRC if high-quality AV sync is requested, even if not
        // directly queuing to display, as this will even improve textureview
        // playback.
        char value[PROPERTY_VALUE_MAX];
        if (property_get("persist.sys.media.avsync", value, NULL) &&
                (!strcmp("1", value) || !strcasecmp("true", value))) {
            format->setInt32("auto-frc", 1);
        }
    }
    (*decoder)->init();            // initialize the decoder
    (*decoder)->configure(format); // configure the decoder

    // allocate buffers to decrypt widevine source buffers
    if (!audio && (mSourceFlags & Source::FLAG_SECURE)) {
        Vector<sp<ABuffer> > inputBufs;
        CHECK_EQ((*decoder)->getInputBuffers(&inputBufs), (status_t)OK);

        Vector<MediaBuffer *> mediaBufs;
        for (size_t i = 0; i < inputBufs.size(); i++) {
            const sp<ABuffer> &buffer = inputBufs[i];
            MediaBuffer *mbuf = new MediaBuffer(buffer->data(), buffer->size());
            mediaBufs.push(mbuf);
        }

        status_t err = mSource->setBuffers(audio, mediaBufs);
        if (err != OK) {
            for (size_t i = 0; i < mediaBufs.size(); ++i) {
                mediaBufs[i]->release();
            }
            mediaBufs.clear();
            ALOGE("Secure source didn't support secure mediaBufs.");
            return err;
        }
    }
    return OK;
}
-------------------------------------
-------------------------------------
void NuPlayer::Decoder::doFlush(bool notifyComplete) {
    if (mCCDecoder != NULL) {
        mCCDecoder->flush(); // first flush the closed-caption decoder
    }

    if (mRenderer != NULL) {
        mRenderer->flush(mIsAudio, notifyComplete);
        mRenderer->signalTimeDiscontinuity(); // then flush the Renderer
    }

    status_t err = OK;
    if (mCodec != NULL) {
        err = mCodec->flush(); // finally flush the decoder itself
        mCSDsToSubmit = mCSDsForCurrentFormat; // copy operator
        ++mBufferGeneration;
    }
    ...
    releaseAndResetMediaBuffers(); // clear the buffers
}
DecoderBase
The base class of NuPlayer::Decoder. It too maintains its own Looper, and all of its work is likewise driven by asynchronous messages; the various onMessage handlers are virtual functions implemented by the subclasses. You can also see how setRenderer ties it to the Renderer.
NuPlayer::DecoderBase::DecoderBase(const sp<AMessage> &notify)
    : mNotify(notify),
      mBufferGeneration(0),
      mPaused(false),
      mStats(new AMessage),
      mRequestInputBuffersPending(false) {
    // Every decoder has its own looper because MediaCodec operations
    // are blocking, but NuPlayer needs asynchronous operations.
    mDecoderLooper = new ALooper;
    mDecoderLooper->setName("NPDecoder");
    mDecoderLooper->start(false, false, ANDROID_PRIORITY_AUDIO);
}

void NuPlayer::DecoderBase::init() {
    mDecoderLooper->registerHandler(this);
}
NuPlayer::Decoder
Each piece of work is handled by the corresponding onMessage method.
First look at the init and configure methods. init is inherited directly from DecoderBase: it just registers the handler with the looper.
--------------------------------------
void NuPlayer::Decoder::onConfigure(const sp<AMessage> &format) {
    ...
    mCodec = AVUtils::get()->createCustomComponentByName(
            mCodecLooper, mime.c_str(), false /* encoder */, format);
    if (mCodec == NULL) {
        // create the decoder from the MIME type
        mCodec = MediaCodec::CreateByType(
                mCodecLooper, mime.c_str(), false /* encoder */, NULL /* err */, mPid);
    }
    // mCodec is simply the MediaCodec from libstagefright -- nothing special here
    err = mCodec->configure(
            format, mSurface, NULL /* crypto */, 0 /* flags */);
    if (err != OK) {
        ALOGE("Failed to configure %s decoder (err=%d)", mComponentName.c_str(), err);
        mCodec->release();
        mCodec.clear();
        handleError(err);
        return;
    }
    rememberCodecSpecificData(format);

    // the following should work in configured state: read the format info
    CHECK_EQ((status_t)OK, mCodec->getOutputFormat(&mOutputFormat));
    CHECK_EQ((status_t)OK, mCodec->getInputFormat(&mInputFormat));

    mStats->setString("mime", mime.c_str());
    mStats->setString("component-name", mComponentName.c_str());

    if (!mIsAudio) {
        int32_t width, height;
        if (mOutputFormat->findInt32("width", &width)
                && mOutputFormat->findInt32("height", &height)) {
            mStats->setInt32("width", width);
            mStats->setInt32("height", height);
        }
    }

    // start the MediaCodec
    err = mCodec->start();
    if (err != OK) {
        ALOGE("Failed to start %s decoder (err=%d)", mComponentName.c_str(), err);
        mCodec->release();
        mCodec.clear();
        handleError(err);
        return;
    }

    // first release all media buffers and reset them to null
    releaseAndResetMediaBuffers();
    ...
}
As mentioned, NuPlayer::Decoder wraps a MediaCodec, so there is no need to study how decoding itself works; what matters is what comes out of this module and how data is fed into it.
First, the output:
when MediaCodec has output available, onMessageReceived contains:
case MediaCodec::CB_OUTPUT_AVAILABLE:
{
    int32_t index;
    size_t offset;
    size_t size;
    int64_t timeUs;
    int32_t flags;

    CHECK(msg->findInt32("index", &index));
    CHECK(msg->findSize("offset", &offset));
    CHECK(msg->findSize("size", &size));
    CHECK(msg->findInt64("timeUs", &timeUs));
    CHECK(msg->findInt32("flags", &flags));

    handleAnOutputBuffer(index, offset, size, timeUs, flags);
    break;
}
-------------------------------------
bool NuPlayer::Decoder::handleAnOutputBuffer(
size_t index,
size_t offset,
size_t size,
int64_t timeUs,
int32_t flags) {
CHECK_LT(bufferIx, mOutputBuffers.size());
sp&ABuffer&
mCodec-&getOutputBuffer(index, &buffer);
 //发送kWhatRenderBuffer消息
sp&AMessage& reply = new AMessage(kWhatRenderBuffer, this);
reply-&setSize("buffer-ix", index);
reply-&setInt32("generation", mBufferGeneration);
if (eos) {
ALOGI("[%s] saw output EOS", mIsAudio ? "audio" : "video");
    //EOS了
buffer-&meta()-&setInt32("eos", true);
reply-&setInt32("eos", true);
} else if (mSkRenderingUntilMediaTimeUs &= 0) {
if (timeUs & mSkipRenderingUntilMediaTimeUs) {
ALOGV("[%s] dropping buffer at time %lld as requested.",
mComponentName.c_str(), (long long)timeUs);
    //中间这一段不用render,skip掉
reply-&post();
mSkipRenderingUntilMediaTimeUs = -1;
} else if ((flags & MediaCodec::BUFFER_FLAG_DATACORRUPT) &&
AVNuUtils::get()-&dropCorruptFrame()) {
ALOGV("[%s] dropping corrupt buffer at time %lld as requested.",
mComponentName.c_str(), (long long)timeUs);
  //本段buffer坏到了,扔掉
reply-&post();
mNumFramesTotal += !mIsA
// wait until 1st frame comes out to signal resume complete
notifyResumeCompleteIfNecessary();
if (mRenderer != NULL) {
// send the buffer to renderer.把Buffer送到Renderer
mRenderer-&queueBuffer(mIsAudio, buffer, reply);
if (eos && !isDiscontinuityPending()) {
mRenderer-&queueEOS(mIsAudio, ERROR_END_OF_STREAM);
--------------------------------------
case kWhatRenderBuffer:
if (!isStaleReply(msg)) {
onRenderBuffer(msg);
-------------------------------------
void NuPlayer::Decoder::onRenderBuffer(const sp&AMessage& &msg) {
if (!mIsAudio) {
int64_t timeUs;
sp&ABuffer& buffer = mOutputBuffers[bufferIx];
buffer-&meta()-&findInt64("timeUs", &timeUs);
if (mCCDecoder != NULL && mCCDecoder-&isSelected()) {
mCCDecoder-&display(timeUs);//字幕显示
if (msg-&findInt32("render", &render) && render) {
int64_t timestampNs;
CHECK(msg-&findInt64("timestampNs", &timestampNs));
//由MediaCodec的renderOutputBufferAndRelease完成
err = mCodec-&renderOutputBufferAndRelease(bufferIx, timestampNs);
mNumOutputFramesDropped += !mIsA
err = mCodec-&releautputBuffer(bufferIx);
Now the input:
when MediaCodec has an input buffer available, onMessageReceived contains:
case MediaCodec::CB_INPUT_AVAILABLE:
{
    int32_t index;
    CHECK(msg->findInt32("index", &index));

    handleAnInputBuffer(index);
    break;
}
------------------------------------
bool NuPlayer::Decoder::handleAnInputBuffer(size_t index) {
    ...
    sp<ABuffer> buffer;
    mCodec->getInputBuffer(index, &buffer);
    ...
    if (index >= mInputBuffers.size()) {
        for (size_t i = mInputBuffers.size(); i <= index; ++i) {
            mInputBuffers.add();
            mMediaBuffers.add();
            mInputBufferIsDequeued.add();
            mMediaBuffers.editItemAt(i) = NULL;
            mInputBufferIsDequeued.editItemAt(i) = false;
        }
    }
    mInputBuffers.editItemAt(index) = buffer;

    //CHECK_LT(bufferIx, mInputBuffers.size());

    if (mMediaBuffers[index] != NULL) {
        mMediaBuffers[index]->release();
        mMediaBuffers.editItemAt(index) = NULL;
    }
    mInputBufferIsDequeued.editItemAt(index) = true;

    if (!mCSDsToSubmit.isEmpty()) {
        sp<AMessage> msg = new AMessage();
        msg->setSize("buffer-ix", index);

        sp<ABuffer> buffer = mCSDsToSubmit.itemAt(0);
        ALOGI("[%s] resubmitting CSD", mComponentName.c_str());
        msg->setBuffer("buffer", buffer);
        mCSDsToSubmit.removeAt(0);
        if (!onInputBufferFetched(msg)) {
            handleError(UNKNOWN_ERROR);
            return false;
        }
        return true;
    }

    while (!mPendingInputMessages.empty()) {
        sp<AMessage> msg = *mPendingInputMessages.begin();
        if (!onInputBufferFetched(msg)) {
            break;
        }
        mPendingInputMessages.erase(mPendingInputMessages.begin());
    }

    if (!mInputBufferIsDequeued.editItemAt(index)) {
        return true;
    }

    mDequeuedInputBuffers.push_back(index);

    onRequestInputBuffers();
    return true;
}
------------------------------------
bool NuPlayer::Decoder::onInputBufferFetched(const sp<AMessage> &msg) {
    ...
    sp<ABuffer> buffer;
    bool hasBuffer = msg->findBuffer("buffer", &buffer);

    // handle widevine classic source - that fills an arbitrary input buffer
    MediaBuffer *mediaBuffer = NULL;
    if (hasBuffer) {
        // TODO: for more detail, study MediaBuffer itself
        mediaBuffer = (MediaBuffer *)(buffer->getMediaBufferBase());
        if (mediaBuffer != NULL) {
            // likely filled another buffer than we requested: adjust buffer index
            size_t ix;
            for (ix = 0; ix < mInputBuffers.size(); ix++) {
                const sp<ABuffer> &buf = mInputBuffers[ix];
                if (buf->data() == mediaBuffer->data()) {
                    // all input buffers are dequeued on start, hence the check
                    if (!mInputBufferIsDequeued[ix]) {
                        ALOGV("[%s] received MediaBuffer for #%zu instead of #%zu",
                                mComponentName.c_str(), ix, bufferIx);
                        mediaBuffer->release();
                        return false;
                    }

                    // TRICKY: need buffer for the metadata, so instead, set
                    // codecBuffer to the same (though incorrect) buffer to
                    // avoid a memcpy into the codecBuffer
                    codecBuffer = buffer;
                    codecBuffer->setRange(
                            mediaBuffer->range_offset(),
                            mediaBuffer->range_length());
                    bufferIx = ix;
                    break;
                }
            }
            CHECK(ix < mInputBuffers.size());
        }
    }

    if (buffer == NULL /* includes !hasBuffer */) {
        int32_t streamErr = ERROR_END_OF_STREAM;
        CHECK(msg->findInt32("err", &streamErr) || !hasBuffer);
        CHECK(streamErr != OK);

        // attempt to queue EOS
        status_t err = mCodec->queueInputBuffer(
                bufferIx,
                0,
                0,
                0,
                MediaCodec::BUFFER_FLAG_EOS);
        if (err == OK) {
            mInputBufferIsDequeued.editItemAt(bufferIx) = false;
        } else if (streamErr == ERROR_END_OF_STREAM) {
            streamErr = err;
            // err will not be ERROR_END_OF_STREAM
        }

        if (streamErr != ERROR_END_OF_STREAM) {
            ALOGE("Stream error for %s (err=%d), EOS %s queued",
                    mComponentName.c_str(),
                    streamErr,
                    err == OK ? "successfully" : "unsuccessfully");
            handleError(streamErr);
        }
        return true;
    }

    sp<AMessage> extra;
    if (buffer->meta()->findMessage("extra", &extra) && extra != NULL) {
        int64_t resumeAtMediaTimeUs;
        if (extra->findInt64(
                    "resume-at-mediaTimeUs", &resumeAtMediaTimeUs)) {
            ALOGI("[%s] suppressing rendering until %lld us",
                    mComponentName.c_str(), (long long)resumeAtMediaTimeUs);
            mSkipRenderingUntilMediaTimeUs = resumeAtMediaTimeUs;
        }
    }

    int64_t timeUs = 0;
    uint32_t flags = 0;
    CHECK(buffer->meta()->findInt64("timeUs", &timeUs));

    int32_t eos, csd;
    // we do not expect SYNCFRAME for decoder
    if (buffer->meta()->findInt32("eos", &eos) && eos) {
        flags |= MediaCodec::BUFFER_FLAG_EOS;
    } else if (buffer->meta()->findInt32("csd", &csd) && csd) {
        flags |= MediaCodec::BUFFER_FLAG_CODECCONFIG;
    }

    // copy into codec buffer
    if (buffer != codecBuffer) {
        if (buffer->size() > codecBuffer->capacity()) {
            handleError(ERROR_BUFFER_TOO_SMALL);
            mDequeuedInputBuffers.push_back(bufferIx);
            return false;
        }
        codecBuffer->setRange(0, buffer->size());
        memcpy(codecBuffer->data(), buffer->data(), buffer->size());
    }

    status_t err = mCodec->queueInputBuffer(
            bufferIx,
            codecBuffer->offset(),
            codecBuffer->size(),
            timeUs,
            flags);
    if (err != OK) {
        if (mediaBuffer != NULL) {
            mediaBuffer->release();
        }
        ALOGE("Failed to queue input buffer for %s (err=%d)",
                mComponentName.c_str(), err);
        handleError(err);
    } else {
        mInputBufferIsDequeued.editItemAt(bufferIx) = false;
        if (mediaBuffer != NULL) {
            CHECK(mMediaBuffers[bufferIx] == NULL);
            mMediaBuffers.editItemAt(bufferIx) = mediaBuffer;
        }
    }
    return true;
}
Whatever else happens, in the end it is MediaCodec::queueInputBuffer that does the real work.
-------------------------------------
void NuPlayer::DecoderBase::onRequestInputBuffers() {
    if (mRequestInputBuffersPending) {
        return;
    }

    // doRequestBuffers() return true if we should request more data
    if (doRequestBuffers()) {
        mRequestInputBuffersPending = true;

        // note: this re-posts itself, so requests keep looping
        sp<AMessage> msg = new AMessage(kWhatRequestInputBuffers, this);
        msg->post(10 * 1000ll);
    }
}
--------------------------------------
case kWhatRequestInputBuffers:
{
    mRequestInputBuffersPending = false;
    onRequestInputBuffers();
    break;
}
--------------------------------------
/*
 * returns true if we should request more data
 */
bool NuPlayer::Decoder::doRequestBuffers() {
    // mRenderer is only NULL if we have a legacy widevine source that
    // is not yet ready. In this case we must not fetch input.
    if (isDiscontinuityPending() || mRenderer == NULL) {
        return false;
    }
    status_t err = OK;
    while (err == OK && !mDequeuedInputBuffers.empty()) {
        size_t bufferIx = *mDequeuedInputBuffers.begin();
        sp<AMessage> msg = new AMessage();
        msg->setSize("buffer-ix", bufferIx);
        err = fetchInputData(msg);
        if (err != OK && err != ERROR_END_OF_STREAM) {
            // if EOS, need to queue EOS buffer
            break;
        }
        mDequeuedInputBuffers.erase(mDequeuedInputBuffers.begin());

        if (!mPendingInputMessages.empty()
                || !onInputBufferFetched(msg)) {  // analyzed above
            mPendingInputMessages.push_back(msg);
        }
    }

    return err == -EWOULDBLOCK
            && mSource->feedMoreTSData() == OK;
}
-------------------------------------
status_t NuPlayer::Decoder::fetchInputData(sp<AMessage> &reply) {
    sp<ABuffer> accessUnit;
    bool dropAccessUnit;
    do {
        // this call into the Source is what ties the Decoder to the Source
        status_t err = mSource->dequeueAccessUnit(mIsAudio, &accessUnit);
        ...
    } while (...);
------------------------------------
That completes the walkthrough of the Decoder.
Renderer (under the nuplayer directory)

The Renderer is initialized in NuPlayer::onStart:
void NuPlayer::onStart(int64_t startPositionUs) {
    ...
    sp<AMessage> notify = new AMessage(kWhatRendererNotify, this);
    ++mRendererGeneration;
    notify->setInt32("generation", mRendererGeneration);
    mRenderer = AVNuFactory::get()->createRenderer(mAudioSink, notify, flags);
    mRendererLooper = new ALooper;
    mRendererLooper->setName("NuPlayerRenderer");
    mRendererLooper->start(false, false, ANDROID_PRIORITY_AUDIO);
    mRendererLooper->registerHandler(mRenderer);

    status_t err = mRenderer->setPlaybackSettings(mPlaybackSettings);
    ...
    float rate = getFrameRate();
    if (rate > 0) {
        mRenderer->setVideoFrameRate(rate);
    }

    if (mVideoDecoder != NULL) {
        mVideoDecoder->setRenderer(mRenderer);
    }
    if (mAudioDecoder != NULL) {
        mAudioDecoder->setRenderer(mRenderer);
    }

    postScanSources();
}
The Renderer receives its input from NuPlayer::Decoder::handleAnOutputBuffer:

bool NuPlayer::Decoder::handleAnOutputBuffer(
        size_t index,
        size_t offset,
        size_t size,
        int64_t timeUs,
        int32_t flags) {
    ...
    if (mRenderer != NULL) {
        // send the buffer to renderer
        mRenderer->queueBuffer(mIsAudio, buffer, reply);
        if (eos && !isDiscontinuityPending()) {
            mRenderer->queueEOS(mIsAudio, ERROR_END_OF_STREAM);
        }
    }
    ...
}
libmediaplayerservice/nuplayer/NuPlayerRenderer.cpp

mRenderer->queueBuffer eventually lands in the method below:
void NuPlayer::Renderer::onQueueBuffer(const sp<AMessage> &msg) {
    int32_t audio;
    CHECK(msg->findInt32("audio", &audio));

    if (dropBufferIfStale(audio, msg)) {
        return;
    }

    if (audio) {
        mHasAudio = true;
    } else {
        mHasVideo = true;
    }

    if (mHasVideo) {
        if (mVideoScheduler == NULL) {
            // initialize the VideoFrameScheduler, used for VSync alignment
            mVideoScheduler = new VideoFrameScheduler();
            mVideoScheduler->init();
        }
    }

    sp<ABuffer> buffer;
    CHECK(msg->findBuffer("buffer", &buffer));

    sp<AMessage> notifyConsumed;
    CHECK(msg->findMessage("notifyConsumed", &notifyConsumed));

    QueueEntry entry;  // one slot in the buffer queue
    entry.mBuffer = buffer;
    entry.mNotifyConsumed = notifyConsumed;
    entry.mOffset = 0;
    entry.mFinalResult = OK;
    entry.mBufferOrdinal = ++mTotalBuffersQueued;

    // mAudioQueue and mVideoQueue are both List<QueueEntry>
    if (audio) {
        Mutex::Autolock autoLock(mLock);
        mAudioQueue.push_back(entry);
        postDrainAudioQueue_l();
    } else {
        mVideoQueue.push_back(entry);
        postDrainVideoQueue();  // re-posted periodically
    }

    Mutex::Autolock autoLock(mLock);
    if (!mSyncQueues || mAudioQueue.empty() || mVideoQueue.empty()) {
        return;
    }

    sp<ABuffer> firstAudioBuffer = (*mAudioQueue.begin()).mBuffer;
    sp<ABuffer> firstVideoBuffer = (*mVideoQueue.begin()).mBuffer;

    if (firstAudioBuffer == NULL || firstVideoBuffer == NULL) {
        // EOS signalled on either queue
        syncQueuesDone_l();
        return;
    }

    int64_t firstAudioTimeUs;
    int64_t firstVideoTimeUs;
    CHECK(firstAudioBuffer->meta()
            ->findInt64("timeUs", &firstAudioTimeUs));
    CHECK(firstVideoBuffer->meta()
            ->findInt64("timeUs", &firstVideoTimeUs));

    int64_t diff = firstVideoTimeUs - firstAudioTimeUs;

    ALOGV("queueDiff = %.2f secs", diff / 1E6);

    if (diff > 100000ll) {
        // Audio data starts more than 0.1 secs before video.
        // Drop some audio.
        (*mAudioQueue.begin()).mNotifyConsumed->post();
        mAudioQueue.erase(mAudioQueue.begin());
        return;
    }

    syncQueuesDone_l();
}
Renderer output

postDrainVideoQueue, in turn, posts a kWhatDrainVideoQueue message:

void NuPlayer::Renderer::postDrainVideoQueue() {
    ...
    sp<AMessage> msg = new AMessage(kWhatDrainVideoQueue, this);
    msg->setInt32("drainGeneration", getDrainGeneration(false /* audio */));
    ...
}

case kWhatDrainVideoQueue:
{
    int32_t generation;
    CHECK(msg->findInt32("drainGeneration", &generation));
    if (generation != getDrainGeneration(false /* audio */)) {
        break;
    }

    mDrainVideoQueuePending = false;

    onDrainVideoQueue();

    postDrainVideoQueue();
    break;
}
void NuPlayer::Renderer::onDrainVideoQueue() {
    ...
    QueueEntry *entry = &*mVideoQueue.begin();

    if (entry->mBuffer == NULL) {
        // EOS
        notifyEOS(false /* audio */, entry->mFinalResult);
        ...
    }

    int64_t nowUs = ALooper::GetNowUs();
    int64_t realTimeUs;
    int64_t mediaTimeUs = -1;
    if (mFlags & FLAG_REAL_TIME) {
        CHECK(entry->mBuffer->meta()->findInt64("timeUs", &realTimeUs));
    } else {
        CHECK(entry->mBuffer->meta()->findInt64("timeUs", &mediaTimeUs));
        realTimeUs = getRealTimeUs(mediaTimeUs, nowUs);
    }

    bool tooLate = false;

    if (!mPaused) {
        // compare the media time against the wall clock
        setVideoLateByUs(nowUs - realTimeUs);
        tooLate = (mVideoLateByUs > 40000);

        if (tooLate) {
            ALOGV("video late by %lld us (%.2f secs)",
                    (long long)mVideoLateByUs, mVideoLateByUs / 1E6);
        } else {
            int64_t mediaUs = 0;
            mMediaClock->getMediaTime(realTimeUs, &mediaUs);
            ALOGV("rendering video at media time %.2f secs",
                    (mFlags & FLAG_REAL_TIME ? realTimeUs :
                    mediaUs) / 1E6);
        }
    } else {
        setVideoLateByUs(0);
    }

    // Always render the first video frame while keeping stats on A/V sync.
    if (!mVideoSampleReceived) {
        realTimeUs = nowUs;
        tooLate = false;
    }

    entry->mNotifyConsumed->setInt64("timestampNs", realTimeUs * 1000ll);
    entry->mNotifyConsumed->setInt32("render", !tooLate);
    entry->mNotifyConsumed->post();
    mVideoQueue.erase(mVideoQueue.begin());
    entry = NULL;

    mVideoSampleReceived = true;

    if (!mPaused) {
        if (!mVideoRenderingStarted) {
            mVideoRenderingStarted = true;
            notifyVideoRenderingStart();  // the very first frame
        }
        Mutex::Autolock autoLock(mLock);
        // notifyIfMediaRenderingStarted_l() takes essentially the same path
        // as notifyVideoRenderingStart() above: each posts its kWhatXxx
        // message, which is eventually handled in NuPlayer's onMessageReceived
        notifyIfMediaRenderingStarted_l();
    }
}
case kWhatRendererNotify:
{
    ...
    int32_t what;
    CHECK(msg->findInt32("what", &what));
    // depending on "what", all the familiar notifications show up here
    if (what == Renderer::kWhatEOS) {
        int32_t audio;
        CHECK(msg->findInt32("audio", &audio));
        int32_t finalResult;
        CHECK(msg->findInt32("finalResult", &finalResult));
        if (audio) {
            mAudioEOS = true;
        } else {
            mVideoEOS = true;
        }
        if (finalResult == ERROR_END_OF_STREAM) {
            ALOGV("reached %s EOS", audio ? "audio" : "video");
        } else {
            ALOGE("%s track encountered an error (%d)",
                    audio ? "audio" : "video", finalResult);
            notifyListener(
                    MEDIA_ERROR, MEDIA_ERROR_UNKNOWN, finalResult);
        }
        if ((mAudioEOS || mAudioDecoder == NULL)
                && (mVideoEOS || mVideoDecoder == NULL)) {
            notifyListener(MEDIA_PLAYBACK_COMPLETE, 0, 0);
        }
    } else if (what == Renderer::kWhatFlushComplete) {
        int32_t audio;
        CHECK(msg->findInt32("audio", &audio));
        if (audio) {
            mAudioEOS = false;
        } else {
            mVideoEOS = false;
        }
        ALOGV("renderer %s flush completed.", audio ? "audio" : "video");
        if (audio && (mFlushingAudio == NONE || mFlushingAudio == FLUSHED
                || mFlushingAudio == SHUT_DOWN)) {
            // Flush has been handled by tear down.
            break;
        }
        handleFlushComplete(audio, false /* isDecoder */);
        finishFlushIfPossible();
    } else if (what == Renderer::kWhatVideoRenderingStart) {
        // the first notification above: the very first frame was rendered
        notifyListener(MEDIA_INFO, MEDIA_INFO_RENDERING_START, 0);
    } else if (what == Renderer::kWhatMediaRenderingStart) {
        // the second notification above
        ALOGV("media rendering started");
        notifyListener(MEDIA_STARTED, 0, 0);
    } else if (what == Renderer::kWhatAudioTearDown) {
        ...
    }
    break;
}
frameworks/av/include/media (mediaplayer.h)

enum media_info_type {
    ...
    // The player just pushed the very first video frame for rendering
    MEDIA_INFO_RENDERING_START = 3,
    ...
};

enum media_event_type {
    ...
    MEDIA_STARTED,
    ...
};
This completes the analysis of NuPlayer's AHandler mechanism and of its three modules: Source, Decoder, and Renderer. Comments and corrections are welcome.