
  • Author fragmented MPEG-4 content with AVAssetWriter


    Transform your audio and video content into fragmented MPEG-4 files for a faster and smoother HLS streaming experience. Learn how to work with the fragmented MPEG-4 format, generate fragmented content from a movie, and set up AVAssetWriter to create fragments for HLS output.

    Resources

    • Writing Fragmented MPEG-4 Files for HTTP Live Streaming
      • HD Video
      • SD Video

    Hello and welcome to WWDC.

    Hello. My name is Takayuki Mizuno. I am a CoreMedia engineer at Apple. This session is about a new feature of AVFoundation for writing fragmented MP4 files for HLS. Here is a diagram that shows a typical workflow for HLS. There is a source material. This may be a video on demand material or live material. There is a part that encodes media data. There the video samples are encoded to, for example, HEVC, and the audio samples are encoded to, for example, AAC.

    Then there is a segmenter here...

    which segments media data in a specific format.

    The segmenter also creates a playlist that lists those segments at the same time.

    Finally, there is a web server that hosts the content. AVFoundation provides new features that are useful mainly for what the segmenter does.

    AVFoundation has AVAssetWriter to allow media authoring applications to write media data to a new file of a specified container type such as MP4. We enhanced it to output the media data in fragmented MP4 format for HLS.

    Apple HLS supports other formats such as MPEG-2 transport stream, ADTS and MPEG audio. But this new feature is specific to fragmented MP4. Here is an example of a use case.

    My example is that the source is video on demand material.

    AVFoundation has AVAssetReader, which is an object to read sample data from a media file.

    You read sample data from a source movie, push the samples to AVAssetWriter and write the fragmented MP4 segment files.

    Another example is that the source is live material.

    AVFoundation has AVCaptureVideoDataOutput and AVCaptureAudioDataOutput, which are objects to provide you with captured sample data from a connected device.

    You receive sample data from a device, push the samples to AVAssetWriter and write the fragmented MP4 segment files. Fragmented MP4 is a streaming media format based on ISO base media file format. Apple HTTP Live Streaming has supported fragmented MP4 since 2016. Here's a basic diagram of a regular MP4 file. It starts with a file type box that indicates which of the file formats this file conforms to.

    There is a movie box that organizes the information about all the sample data, then there is a media data box that contains all the sample data. The movie box contains the information relevant to the entire set of samples, such as the audio and video codecs.

    It also contains references to the sample data such as location of the sample data. The order of most of those boxes is arbitrary.

    The file type box has to come first, but the other boxes can come in pretty much any order. If you have used AVAssetWriter, you may know that AVAssetWriter has a movieFragmentInterval property. This specifies the frequency at which movie fragments will be written.

    The resulting movie is what is called a fragmented movie. Here is a basic diagram of a fragmented movie file.

    There are movie fragment boxes in this movie, each of which references the sample data held in the media data box that follows it.

    This format structure is useful, for example, for live capture since even if writing is unexpectedly interrupted by a crash or something, data that is partially written is still accessible and playable.

    Please note that in this fragmented movie case, if writing finishes successfully, AVAssetWriter defragments the file as the last step. So this ends up making a regular movie file.

    Here is a basic diagram of a fragmented MP4 file. The file type box comes first, and the movie box comes after the file type box. Then there is a movie fragment box, which is followed by a media data box. And after that, pairs of movie fragment boxes and media data boxes continue. The order of those boxes should be like this. The major difference between the fragmented movie and the fragmented MP4 is that the movie box of the fragmented MP4 does not contain references to the sample data. It only contains the information relevant to the entire set of samples, and the order of the movie fragment box and the media data box is reversed.

    Now I'll talk about how to use AVAssetWriter. This shows how to create an instance of AVAssetWriter. You do not provide an output URL, since AVAssetWriter does not write a file but outputs data. You just have to specify the output content type. Fragmented MP4 conforms to the MP4 family, so you should always specify a UTType with AVFileType.mp4. You create an AVAssetWriterInput. In this example, compressionSettings, which is a dictionary, is provided to encode the media samples. Alternatively, you can set the outputSettings to nil, which is called passthrough mode. In passthrough mode, AVAssetWriter just writes the samples as they are given. Then you add the input to the AVAssetWriter. Here is how to configure AVAssetWriter. You have to specify outputFileTypeProfile.

    You should specify Apple HLS or CMAF compliant profile to output the media data in fragmented MP4 format.

    You specify preferredOutputSegmentInterval.

    AVAssetWriter outputs the media data at that interval. This interval time can be a rational time, but target segment duration for HLS must be an integer in seconds, so we are not using greater precision here because this is the right choice for HLS. You also specify initialSegmentStartTime...

    which is the starting point of the interval.

    Then you specify a delegate object that implements a designated protocol.

    I will talk about the protocol later. After this, the same process as normal file writing continues.

    You start writing, then append the samples using AVAssetWriterInput.

    I'm not actually going to talk about this anymore in this session, but if you watch the WWDC 2011 session "Working with Media in AV Foundation," you will see how to perform those processes. You can find the session video on Apple's Developer site. Here is the protocol for the delegate methods. There are two methods. Both delegate methods deliver the fragmented media data as NSData.

    They also deliver AVAssetSegmentType...

    that indicates the type of the fragmented media data.

    You should implement one of them. The difference is that the second one delivers an AVAssetSegmentReport object. That object contains the information about the fragmented media data. If you do not need such information, you should implement the first one.

    AVAssetSegmentType has two types...

    initialization and separable. Fragmented media data with type of initialization contains the file type box and the movie box. Fragmented media data with type of separable contains one movie fragment box and one media data box.

    You will receive data with a type of separable successively.

    You package the media data you received into what is called a segment and write the segment to files.
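    As a minimal sketch of this packaging step (the class name, the file-naming scheme "init.mp4"/"segment-N.m4s", and the output directory are illustrative assumptions, not from the session), each delivered chunk could simply be written to its own file:

    ```swift
    import AVFoundation
    import Foundation

    // Sketch of a segment-packaging delegate; names and layout are assumptions.
    final class SegmentFileWriter: NSObject, AVAssetWriterDelegate {
        private let outputDirectory: URL
        private var segmentIndex = 0

        init(outputDirectory: URL) {
            self.outputDirectory = outputDirectory
        }

        func assetWriter(_ writer: AVAssetWriter,
                         didOutputSegmentData segmentData: Data,
                         segmentType: AVAssetSegmentType) {
            let fileName: String
            switch segmentType {
            case .initialization:
                // File type box + movie box: the initialization segment.
                fileName = "init.mp4"
            case .separable:
                // One movie fragment box + one media data box per call.
                fileName = "segment-\(segmentIndex).m4s"
                segmentIndex += 1
            @unknown default:
                return
            }
            try? segmentData.write(to: outputDirectory.appendingPathComponent(fileName))
        }
    }
    ```

    A real implementation would also capture the AVAssetSegmentReport (via the second delegate method) to build the playlist described below.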

    The pair of the file type box and the movie box...

    should be an initialization segment.

    And a pair of a movie fragment box and a media data box...

    can be one segment.

    For HLS, a single file that contains all of those segments can be used for streaming.

    Or each segment can be divided into multiple segment files.

    Moreover, multiple pairs of movie fragment boxes and media data boxes can form one segment and be stored in one segment file. HLS uses a playlist. The playlist is the centerpiece of how the client finds out about the segments.

    Here is an example of a playlist. There are several tags that indicate information about segments, such as the URL and the duration. AVAssetWriter does not write the playlist; you should write the playlist yourself. That is why one of the delegate methods delivers AVAssetSegmentReport. You can create the playlist and the I-frame playlist based on the information AVAssetSegmentReport provides.
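    For illustration only (the segment names, durations, and version number here are assumptions, not output of the session's tools), a media playlist referencing six-second fMP4 segments might look like:

    ```
    #EXTM3U
    #EXT-X-VERSION:7
    #EXT-X-TARGETDURATION:6
    #EXT-X-MEDIA-SEQUENCE:0
    #EXT-X-MAP:URI="init.mp4"
    #EXTINF:6.00000,
    segment-0.m4s
    #EXTINF:6.00000,
    segment-1.m4s
    #EXT-X-ENDLIST
    ```

    The EXT-X-MAP tag points clients at the initialization segment, and each EXTINF entry carries the duration reported for the following segment file.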

    If you look at our sample code, called fmp4Writer, you will see how to create the playlist. Also, the documentation about playlists can be found on the Developer streaming page. In passthrough mode, which is a mode where samples are not encoded, AVAssetWriter outputs media data that includes all the samples just prior to the next sync sample after the preferred output segment interval has been reached or exceeded.

    The sync sample here is a sample that does not depend on another sample. This is because CMAF requires that every segment starts with a sync sample, and Apple HLS also prefers this.

    This rule applies not only to video, but also to audio that has sample dependencies, such as USAC audio. Therefore, for passthrough, only one AVAssetWriterInput can be added. If sync samples are not laid out near the interval, the media data may be output with a considerably longer duration than what is specified. As a result, it will be unsuitable for HLS. One of the solutions is to encode the video samples at the same time. As I mentioned earlier, if you specify output settings for compression in AVAssetWriterInput, samples will be encoded. In encoding mode, a video sample that just reaches or exceeds the preferred output segment interval will be forced to be encoded as a sync sample.

    In other words, sync samples are automatically generated so that the fragmented media data will be output at or very close to the specified interval. If you encode both audio and video, one AVAssetWriterInput for each media can be added. Earlier, I said that for passthrough, only one AVAssetWriterInput can be added. But you can deliver video and audio as separate streams using a master playlist. You can deliver not just one pair of audio and video streams, but also streams at various bit rates or in different languages.

    You need to create multiple instances of AVAssetWriter to create multiple streams. Again, the documents about master playlist and recommendations for encoding video and audio for HLS can be found here at the Developer streaming page.
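    As a hedged illustration of how such variants are tied together (the bandwidths, codec strings, and paths below are made up, not from the session), a master playlist might look like:

    ```
    #EXTM3U
    #EXT-X-STREAM-INF:BANDWIDTH=2000000,RESOLUTION=1280x720,CODECS="hvc1.2.4.L90.B0,mp4a.40.2"
    hi/prog_index.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360,CODECS="hvc1.2.4.L63.B0,mp4a.40.2"
    lo/prog_index.m3u8
    ```

    Each EXT-X-STREAM-INF entry points at the media playlist written by one AVAssetWriter instance.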

    This is an advanced use case, but if you require different segmentation than preferredOutputSegmentInterval provides, you can do it on your own. For example, you may want to output media data not at a sync sample after an interval has been reached, but before the interval has been reached. Here is the code I showed you earlier that sets properties of AVAssetWriter. To signal custom segmentation, set preferredOutputSegmentInterval to CMTime.indefinite. It is not necessary to set initialSegmentStartTime since there is no interval. The other settings are the same as the settings used for preferredOutputSegmentInterval. This is only for passthrough. You cannot encode with custom segmentation. So when you create AVAssetWriterInput...

    you have to set the output settings to nil.

    You call the flushSegment method to output media data.

    flushSegment outputs media data that includes all samples appended since the previous call.

    You must call flushSegment prior to a sync sample so that the next fragmented media data can start with the sync sample.

    Otherwise, an error occurs indicating that fragmented media data started with a non-sync sample.

    In this mode, you can mux audio and video. But if both audio and video have sample dependencies, it would be difficult to align both media evenly. In that case, you should consider packaging video and audio as separate streams.
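    One possible shape for this flushing logic is sketched below; it is an assumption of mine that you would flush at every sync sample, and real code would also compare elapsed duration against its own segmentation target. The function names are mine.

    ```swift
    import AVFoundation
    import CoreMedia

    // Returns true if the sample does not depend on other samples.
    func isSyncSample(_ sampleBuffer: CMSampleBuffer) -> Bool {
        guard let attachments = CMSampleBufferGetSampleAttachmentsArray(
                sampleBuffer, createIfNecessary: false) as? [[CFString: Any]],
              let first = attachments.first else {
            return true  // No attachments: the sample has no dependencies.
        }
        return !(first[kCMSampleAttachmentKey_NotSync] as? Bool ?? false)
    }

    // Flush just before appending a sync sample, so the next fragmented
    // media data starts with that sync sample.
    func append(_ sampleBuffer: CMSampleBuffer,
                to input: AVAssetWriterInput,
                of writer: AVAssetWriter,
                haveAppendedSamples: inout Bool) {
        if haveAppendedSamples && isSyncSample(sampleBuffer) {
            writer.flushSegment()  // outputs everything appended so far
        }
        input.append(sampleBuffer)
        haveAppendedSamples = true
    }
    ```

    Calling flushSegment before, rather than after, appending the sync sample is what keeps each new fragment starting on one, per the requirement above.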

    AVAssetTrack now has a new property that indicates whether the audio track has sample dependencies.

    You can check if a track has sample dependencies in advance using this property. I mentioned earlier that you should specify Apple HLS or CMAF compliant profile to output media data in fragmented MP4 format.

    CMAF is a set of constraints for how you construct fragmented MP4 segments.

    It is a standard for using fragmented MP4 for streaming.

    It is supported by multiple players, including Apple HLS. Some of the constraints are not required if you target just Apple HLS. But if you want a broader audience for your media library, then you could consider CMAF.

    Now I will talk about how HLS deals with audio priming. This diagram represents audio waveform of AAC audio.

    AAC audio codecs require data beyond the source PCM audio samples to correctly encode and decode audio samples due to the nature of the encoding algorithm. For this reason, the encoders add silence before the first true audio sample.

    This is called priming. The most common priming for AAC is 2,112 audio samples, which is just about 48 milliseconds, presuming that the sample rate is 44,100 Hz.

    When audio and video are written in separate files as separate streams, the media data are laid out like this...

    presuming that both media start at time zero. As you may have noticed, if audio and video start at the same time...

    the audio will be delayed by the priming so that the audio and the video will be slightly off.

    CMAF adopts an edit list box in the audio track to compensate for this issue. The priming is edited out...

    so that start time of audio and video match.

    Historically, Apple HLS hasn't used edit lists, so if you specify the Apple HLS profile, the edit list box is not used, for compatibility with older players. Instead, the baseMediaDecodeTime of the audio media is shifted backward by the amount of priming.

    However, the baseMediaDecodeTime cannot be earlier than zero since it is defined as an unsigned integer.

    One solution is to shift both audio and video forward by the same time offset. This way, the baseMediaDecodeTime of the audio media can be shifted backward by the amount of priming.

    Even if the start time is not zero, in HLS, playback starts at the earliest presentation time of the video samples...

    so playback starts immediately without waiting for the time offset.

    This is important, not only for exact video and audio sync, but also for an exact match of time stamps between audio variants with different priming durations. So we recommend shifting the media time by adding a certain amount of time to all samples if you specify the Apple HLS profile.

    The initial segment start time should be shifted by the same time offset as well.
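    A sketch of this shift (the function name and the specific offset value are my own choices) is to rewrite each sample buffer's timing before appending it:

    ```swift
    import AVFoundation
    import CoreMedia

    // Shift all samples forward by a fixed offset so the audio priming can
    // later be shifted backward without baseMediaDecodeTime going negative.
    let timeOffset = CMTime(value: 10, timescale: 1)

    func sampleBufferByOffsettingTiming(_ sampleBuffer: CMSampleBuffer) -> CMSampleBuffer? {
        // First query how many timing entries the buffer carries.
        var count: CMItemCount = 0
        CMSampleBufferGetSampleTimingInfoArray(sampleBuffer, entryCount: 0,
                                               arrayToFill: nil, entriesNeededOut: &count)
        var timing = [CMSampleTimingInfo](repeating: CMSampleTimingInfo(), count: count)
        CMSampleBufferGetSampleTimingInfoArray(sampleBuffer, entryCount: count,
                                               arrayToFill: &timing, entriesNeededOut: nil)
        // Add the offset to every presentation (and valid decode) time stamp.
        for i in 0..<count {
            timing[i].presentationTimeStamp = timing[i].presentationTimeStamp + timeOffset
            if timing[i].decodeTimeStamp.isValid {
                timing[i].decodeTimeStamp = timing[i].decodeTimeStamp + timeOffset
            }
        }
        var shifted: CMSampleBuffer?
        CMSampleBufferCreateCopyWithNewTiming(allocator: kCFAllocatorDefault,
                                              sampleBuffer: sampleBuffer,
                                              sampleTimingEntryCount: count,
                                              sampleTimingArray: &timing,
                                              sampleBufferOut: &shifted)
        return shifted
    }
    ```

    initialSegmentStartTime would then be set to the same offset so segmentation stays aligned with the shifted samples.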

    As I said earlier, the most common priming is less than one second, so a couple of seconds is enough for the time offset. But Media File Segmenter, which is Apple's segmenter tool, uses a ten-second offset, so you could choose the same ten seconds. Our sample code shows how to add a time offset to all samples. In this demo, I will use our sample command line application to create fragmented MP4 segment files and the playlist.

    Then I'll play the resulting content using Safari.

    This demo machine is set up as a local server in advance to host the content.

    I enter the command line application in the Terminal application.

    This command line application reads media data from a premade source movie and encodes the video and the audio. This is the source movie. The length is just 30 seconds. This command line application uses six seconds as the preferred output segment interval. The segment files and the playlist will be created in this directory. Let's get started. The video frame rate of the source movie is 30 frames per second. You see five segment files and a playlist were created. I open the playlist.

    Each segment's duration is six seconds.

    I enter the localhost URL to play it using Safari.

    Let's play it.

    HLS has several requirements. It also has several recommendations to improve the experience for users. We recommend that you validate the segment files and playlist you write using AVAssetWriter to make sure those files meet the requirements and recommendations. The WWDC 2016 session "Validating HTTP Live Streams" is a great source for this purpose. You can find the session video on Apple's Developer site.

    AVAssetWriter now outputs media data in fragmented MP4 format for HLS. Thank you for joining the session. [speaking Japanese]

    • 5:36 - Instantiate AVAssetWriter and input

      // Instantiate asset writer
      let assetWriter = AVAssetWriter(contentType: UTType(AVFileType.mp4.rawValue)!)
      
      // Add inputs
      let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: compressionSettings)
      			
      assetWriter.add(videoInput)
    • 6:28 - Configure AVAssetWriter

      assetWriter.outputFileTypeProfile = .mpeg4AppleHLS
      
      assetWriter.preferredOutputSegmentInterval = CMTime(seconds: 6.0, preferredTimescale: 1)
      
      assetWriter.initialSegmentStartTime = myInitialSegmentStartTime
      
      assetWriter.delegate = myDelegateObject
    • 8:00 - Delegate methods

      optional func assetWriter(_ writer: AVAssetWriter, didOutputSegmentData segmentData: Data, segmentType: AVAssetSegmentType)
      
      
      optional func assetWriter(_ writer: AVAssetWriter, didOutputSegmentData segmentData: Data, segmentType: AVAssetSegmentType, segmentReport: AVAssetSegmentReport?)
    • 8:37 - AVAssetSegmentType

      public enum AVAssetSegmentType : Int {
          case initialization = 1 
          case separable = 2
      }
    • 13:45 - Custom segmentation

      // Set properties
      assetWriter.outputFileTypeProfile = .mpeg4AppleHLS
      
      assetWriter.preferredOutputSegmentInterval = .indefinite
      
      assetWriter.delegate = myDelegateObject
      
      // Passthrough
      let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: nil)
    • 15:17 - Audio has dependencies

      extension AVAssetTrack {
             /* indicates whether this audio track has dependencies (e.g. kAudioFormatMPEGD_USAC) */
          open var hasAudioSampleDependencies: Bool { get }
      }

Developer Footer

  • Videos
  • WWDC20
  • Author fragmented MPEG-4 content with AVAssetWriter
    Copyright © 2025 Apple Inc. All rights reserved.