
AN INFORMATIC APPROACH TO A LONG MEMORY STATIONARY PROCESS

Yiming DING (丁義明)

College of Science, Wuhan University of Science and Technology, Wuhan 430081, China; Hubei Province Key Laboratory of System Science in Metallurgical Process, Wuhan University of Science and Technology, Wuhan 430065, China

    E-mail: dingym@wust.edu.cn

    Liang WU (吳量)

    Center of Statistical Research, School of Statistics, Southwestern University of Finance and Economics, Chengdu 611130, China

    E-mail: wuliang@swufe.edu.cn

Xuyan XIANG (向緒言)

    Hunan Province Cooperative Innovation Center for TCDDLEEZ, School of Mathematics and Physics Science, Hunan University of Arts and Science, Changde 415000, China

    E-mail: xyxiang2001@126.com

Abstract Long memory is an important phenomenon that arises sometimes in the analysis of time series or spatial data. Most of the definitions concerning the long memory of a stationary process are based on the second-order properties of the process. The mutual information between the past and future, I_{p-f}, of a stationary process represents the information stored in the history of the process which can be used to predict the future. We suggest that a stationary process can be referred to as long memory if its I_{p-f} is infinite. For a stationary process with finite block entropy, I_{p-f} is equal to the excess entropy, which is the summation of the redundancies that relate the convergence rate of the conditional (differential) entropy to the entropy rate. Since the definitions of I_{p-f} and the excess entropy of a stationary process require only a very weak moment condition on the distribution of the process, they can be applied to processes whose distributions do not have a bounded second moment. A significant property of I_{p-f} is that it is invariant under one-to-one transformations; this enables us to know the I_{p-f} of a stationary process from that of other processes. For a stationary Gaussian process, long memory in the sense of mutual information is more strict than that in the sense of covariance. We demonstrate that the I_{p-f} of fractional Gaussian noise is infinite if and only if the Hurst parameter H ∈ (1/2, 1).

Key words mutual information between past and future; long memory; stationary process; excess entropy; fractional Gaussian noise

    1 Introduction

Long memory processes and long range dependence processes are synonymous notions [5, 12, 22] which play an important role in various fields, such as hydrology, geophysics, physics, finance, biology, medicine, climatology, environmental sciences, economics, telecommunications, etc. As was mentioned in [5], although long memory and related topics date from the late 19th century, it can properly be said that the notion only really started to attract the interest of a significant number of mathematical researchers (and, in particular, probabilists and statisticians) since the work of Mandelbrot and his colleagues, which laid the foundations for the fractional Brownian motion (FBM) model and its increments (such as the fractional Gaussian noise (FGN) model) – the classical models in the studies of long memory [5, 28]. Similar path-breaking roles can be attributed to Hurst [15] for hydrology, Dobrushin (and, before, Kolmogorov [17]) for physics, and Granger [13] for economics.

A stationary process is a sequence of random variables whose probability law is time invariant. A stationary second-order moment process has long memory when the sum of the autocorrelation function (ACF) diverges, or when there exists a pole at zero frequency of its power spectrum [4, 5]. That is to say, the ACF and the power spectrum of a long memory process both follow a power law, while the underlying process has no characteristic timescale of decay. The correlation of a long memory process decays so slowly that the process cannot be rapidly distinguished from noise. This is in striking contrast to many standard stationary processes. The long memory phenomenon relates to the rate of decay of statistical dependence of a stationary process, with the implication that this decays more slowly than an exponential decay, typically like a power law. Some self-similar processes may exhibit long memory, but not all processes with long memory are self-similar [21]. The definitions of long memory vary from author to author (the econometric survey [14] mentions 11 different definitions), and different definitions are used for different applications. Most of the definitions of long memory that appear in the literature are based on the second-order properties of a stochastic process. Such properties include the asymptotic behavior of covariances, spectral density, and variances of partial sums. The reasons for the popularity of second-order properties in this context are both historical and practical: second-order properties are relatively simple concepts and are easy to estimate from data.

Long memory is popularly defined, for a covariance stationary series {X_n} with spectral density, via the divergence of the sum of the autocovariances:

∑_{n=1}^{∞} |r_n| = +∞,

where r_n = Cov(X_m, X_{m+n}). Conversely, ∑_{n=1}^{∞} |r_n| < +∞ for short memory. Note that correlations provide only limited information about the process if the process is “not very close” to being Gaussian, and the rate of decay of correlations may change significantly after instantaneous one-to-one transformations of the process; moreover, this approach is not valid for processes without a bounded second moment [22]. The question then arises as to whether it is possible to develop a new approach to improve the definition of the long memory of a stationary process so that it is invariant under one-to-one transformations, can capture more dependence information, and can be used for processes without a bounded second moment. For stationary processes without a bounded second moment, some scholars use extreme values to describe long memory [11, 19, 20, 22]. Samorodnitsky suggested using notions from ergodic theory, including ergodic strong mixing, to describe the memory of a stationary process, because these are invariant under one-to-one transformations. The key step in this approach is to look for reasonable strong mixing conditions that significantly distinguish short and long memory stationary processes. Mixing coefficients [1, 2] are also invariant under one-to-one transformations, but they are difficult to compute and still lack reasonable signs for significantly distinguishing long memory and short memory processes. It is well known that Shannon entropy is invariant under one-to-one transformations, so one can expect to find suitable concepts in information theory for distinguishing short and long memory stationary processes. Mutual information captures the dependence between two random variables: two random variables are independent if and only if the mutual information between them is zero. For a stationary process X = {···, X_{−1}, X_0, X_1, X_2, ···}, we can regard X as two random variables: the past X⁻ = ···X_{−2}X_{−1}X_0 and the future X⁺ = X_1X_2···. The mutual information between X⁻ and X⁺ is I_{p-f}(X), which represents the information shared between the past and the future of the stationary process X (see Section 3 for the detailed definition of I_{p-f}). A stationary process X usually admits a Shannon (differential) entropy rate h_μ(X) (for a continuous-valued stationary process, the differential entropy rate may be −∞), and the conditional entropy H(X_n|X_1,···,X_{n−1}) → h_μ(X) (or the conditional differential entropy h(X_n|X_1,···,X_{n−1}) → h_μ(X)) as n → ∞ [6, 16]. In what follows, we try to demonstrate that I_{p-f} can be used to distinguish long memory and short memory stationary processes: a stationary process X is long memory if I_{p-f}(X) = +∞, and it is short memory if I_{p-f}(X) < +∞. The mutual information description of long memory is also related to the ten key challenges of the post-Shannon era raised by Huawei [29]. Such an approach is helpful for the following reasons:

1) The definition of the mutual information I_{p-f}(X) requires a weak moment condition rather than the second moment condition E[X_1²] < +∞; therefore, it can be used to detect the long memory behavior of a stationary process with a heavy tailed distribution;

2) I_{p-f}(X) can distinguish short and long memory stationary processes, as ∑_n |r_n| does in a process with a bounded second moment (see Section 3 for details);

3) I_{p-f}(X) is invariant under one-to-one transformations (Theorem 3.8);

4) I_{p-f}(X) is closely related to the second moment characterization if the stationary process is Gaussian (Theorem 3.11). For fractional Gaussian noise, I_{p-f}(X) is infinite if and only if the Hurst parameter H ∈ (1/2, 1) (Theorem 3.13);

5) For a stationary process with finite block entropy, I_{p-f}(X) is equal to the excess entropy E(X) of X, which is an intuitive measure of the information stored in X (Theorem 4.6) [7, 27]. Whether I_{p-f}(X) is finite or not depends on the rate at which the conditional (differential) entropy converges to the (differential) entropy rate h_μ(X).

The rest of this paper is organized as follows: in Section 2, we recall some basic concepts of information theory. In Section 3, we give the definition of long memory using mutual information for stationary processes, and show that the mutual information is invariant under one-to-one transformations. Furthermore, we illustrate how I_{p-f} is related to covariance when the stationary process is Gaussian, and prove that, for fractional Gaussian noise, I_{p-f} = +∞ if and only if the Hurst parameter H ∈ (1/2, 1). In Section 4, we demonstrate that, for a stationary process with finite block entropy, I_{p-f} is equal to its excess entropy.

    2 Basic Quantities of Information Theory

We recall some basic concepts and theorems about information theory from the books [6] and [16].

The Shannon entropy H(X) of a discrete random variable X, taking values x ∈ S, is defined as

H(X) = −∑_{x∈S} p(x) log p(x),

where the probability that X takes on the particular value x is written as p(x) ≡ P(X = x).

    Shannon entropy measures the uncertainty of a discrete random variable.

The joint Shannon entropy H(X,Y) of two discrete random variables (X,Y) is defined as

H(X,Y) = −∑_{x,y} p(x,y) log p(x,y),

where p(x,y) is the joint distribution of (X,Y).

The joint Shannon entropy of n discrete random variables X_1, X_2, ···, X_n is

H(X_1,···,X_n) = −∑_{x_1,···,x_n} p(x_1,x_2,···,x_n) log p(x_1,x_2,···,x_n),

where p(x_1,x_2,···,x_n) is the joint distribution of (X_1,X_2,···,X_n).

The conditional Shannon entropy H(X|Y), which is the entropy of a discrete random variable X conditional on the knowledge of another discrete random variable Y, is

H(X|Y) = −∑_{x,y} p(x,y) log p(x|y).

When X is a continuous random variable with density f(x), the differential entropy h(X) of X is defined as

h(X) = −∫_S f(x) log f(x) dx,

where S is the support set of the random variable.

The differential entropy h(X) may be negative and does not work as an uncertainty measure, but the difference h(X) − h(Y) indicates the difference between the uncertainties of X and Y.

The joint entropy of n continuous random variables X_1, X_2, ···, X_n with density f is defined as

h(X_1,···,X_n) = −∫ f(x_1,···,x_n) log f(x_1,···,x_n) dx_1···dx_n.

Similarly, if X and Y are continuous random variables with a joint density function f(x,y), the joint differential entropy h(X,Y) of (X,Y) is defined as

h(X,Y) = −∫ f(x,y) log f(x,y) dx dy,

and the conditional differential entropy h(X|Y) as

h(X|Y) = −∫ f(x,y) log f(x|y) dx dy.

For both the Shannon entropy and the differential entropy, we have the following properties [6, 16]:

1) H(X,Y) = H(X) + H(Y|X) = H(Y) + H(X|Y), h(X,Y) = h(X) + h(Y|X) = h(Y) + h(X|Y);

2) Conditioning reduces entropy:

H(X|Y) ≤ H(X),  h(X|Y) ≤ h(X);

3) Chain rules:

H(X_1,···,X_n) = ∑_{i=1}^{n} H(X_i|X_{i−1},···,X_1),  h(X_1,···,X_n) = ∑_{i=1}^{n} h(X_i|X_{i−1},···,X_1).
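To make these identities concrete, here is a minimal numerical sketch (ours, not part of the original text) in Python: it computes H(X), H(Y), H(X,Y) and H(X|Y) for an arbitrarily chosen small joint distribution, and checks property 1) together with the fact that conditioning reduces entropy.

import numpy as np

# Illustrative joint pmf p(x, y) for X in {0,1}, Y in {0,1,2} (rows: x, columns: y).
p_xy = np.array([[0.20, 0.10, 0.15],
                 [0.05, 0.30, 0.20]])

def entropy(p):
    # Shannon entropy in bits of a pmf stored in an array (0 log 0 := 0).
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_XY = entropy(p_xy)                  # joint entropy H(X, Y)
H_X = entropy(p_xy.sum(axis=1))       # marginal entropy H(X)
H_Y = entropy(p_xy.sum(axis=0))       # marginal entropy H(Y)
H_X_given_Y = H_XY - H_Y              # conditional entropy H(X|Y), via property 1)

print(f"H(X)={H_X:.4f}  H(Y)={H_Y:.4f}  H(X,Y)={H_XY:.4f}")
print(f"H(X|Y)={H_X_given_Y:.4f} <= H(X)={H_X:.4f}  (conditioning reduces entropy)")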

A stochastic process X = {X_i, i ∈ Z} is an indexed sequence of random variables. A stochastic process is (strictly) stationary if the joint probability distribution does not change when shifted in time, i.e., the distribution of (X_{i_1+s}, X_{i_2+s},···,X_{i_n+s}) is independent of s for any positive integer n and i_1, i_2,···,i_n ∈ Z.

Let X = {X_i, i ∈ Z} be a stationary stochastic process. For n = 1,2,···, the n-block entropy of X is H_X(n) := H(X_1,X_2,···,X_n) if X is a discrete-valued process, or h_X(n) := h(X_1,X_2,···,X_n) if X is a continuous-valued process. For convenience, H_X(n) (or h_X(n)) is denoted by H(n) (or h(n)) if no confusion occurs, and H(0) = 0 (or h(0) = 0).

We say that a stationary process X admits finite block entropy if all of the block entropies H(n) (or h(n)) are finite, i.e., 0 ≤ H(n) < ∞ (or −∞ < h(n) < +∞) for all n.

In this paper, we focus on stationary processes with finite block entropy.

The following lemma collects some properties of the block (differential) entropy sequence {H(n)} or {h(n)} of a stationary process X:

Lemma 2.1 Let X = {X_n, n ∈ Z} be a stationary process that admits finite block entropy. We have the following properties:

1) Nonincreasing entropy gain: ΔH(n) := H(n) − H(n−1) = H(X_n|X_{n−1},···,X_1) and Δh(n) := h(n) − h(n−1) = h(X_n|X_{n−1},···,X_1) are nonincreasing;

2) Subadditivity: for all nonnegative integers m and n,

H(m+n) ≤ H(m) + H(n),  h(m+n) ≤ h(m) + h(n);

3) Monotonicity of entropy per element: both {H(n)/n} and {h(n)/n} are nonincreasing.

Proof Without loss of generality, we only prove the results for the Shannon entropy; the differential entropy cases are similar.

1) By the chain rule, we have that

H(n) = ∑_{k=1}^{n} H(X_k|X_{k−1},···,X_1).

It follows that ΔH(n) = H(n) − H(n−1) = H(X_n|X_{n−1},···,X_1), which is nonincreasing, because, by stationarity and the fact that conditioning reduces entropy,

H(X_{n+1}|X_n,···,X_1) ≤ H(X_{n+1}|X_n,···,X_2) = H(X_n|X_{n−1},···,X_1).

2) Since ΔH(n) is nonincreasing, by the chain rule, we know that

H(m+n) = H(m) + ∑_{k=m+1}^{m+n} ΔH(k) ≤ H(m) + ∑_{k=1}^{n} ΔH(k) = H(m) + H(n).

3) By the stationarity of X, the chain rule and the nonincreasing entropy gain, ΔH(n+1) ≤ (1/n) ∑_{k=1}^{n} ΔH(k) = H(n)/n, so we have that

H(n+1)/(n+1) = (H(n) + ΔH(n+1))/(n+1) ≤ (H(n) + H(n)/n)/(n+1) = H(n)/n.  □

Remark 2.2 1) Since H(X_n|X_{n−1},···,X_1) ≥ 0, the entropy gain ΔH(n) is nonnegative, but Δh(n) = h(X_n|X_{n−1},···,X_1) is not always nonnegative.

2) If X is a discrete-valued stationary process, X admits finite block entropy if and only if H(X_1) = H(1) < +∞, because the subadditivity of {H(n)} implies that H(n) ≤ nH(1) < +∞. In particular, if E|X_1|^δ < +∞ for some δ > 0, then H(X_1) < +∞ [3, 9]. Thus H(X_1) < +∞ is weaker than the second moment condition E[X_1²] < +∞. For a continuous-valued process, we should avoid the case of h(n) = −∞, so the condition of finite block entropy is required.

3) Suppose that X is a continuous-valued stationary process, and that X does not admit finite block entropy. Let k be the minimal positive integer such that h(k) = −∞. Then h(n) = −∞ for n ≥ k. In fact, by the subadditivity of {h(n)}, h(mk) ≤ m h(k) = −∞ indicates that h(mk) = −∞. If n = mk + s with m ≥ 1 and 0 < s < k, then h(n) ≤ h(mk) + h(s) = −∞, since h(s) is finite by the minimality of k.

    3 Long Memory and Mutual Information

In this section, we give the definition of long memory by using mutual information, and discuss the rationale behind this definition.

    3.1 Definition of Long Memory

The mutual information between two random variables X and Y is defined as follows [16]:

Definition 3.1 Let X and Y be random variables taking values in X and Y. Let μ_X and μ_Y be the probability measures of X and Y on the measurable spaces (X, B(X)) and (Y, B(Y)). μ_XY is the joint probability measure of X and Y on the measurable space (X×Y, B(X)×B(Y)), and μ_X × μ_Y is the product probability measure on the same space. The mutual information I(X;Y) between X and Y is defined as

I(X;Y) := ∫_{X×Y} log( dμ_XY / d(μ_X×μ_Y) ) dμ_XY,

if μ_XY ≪ μ_X × μ_Y (i.e., μ_XY is absolutely continuous with respect to μ_X × μ_Y); otherwise I(X;Y) is infinite.

The mutual information I(X;Y) is the relative entropy (or Kullback-Leibler divergence) between the joint probability measure μ_XY and the product probability measure μ_X × μ_Y.

Suppose that X and Y are two discrete random variables with a joint distribution p(x,y) and marginal distributions p(x) and p(y). Then the Radon-Nikodym derivative is given by

dμ_XY/d(μ_X×μ_Y)(x,y) = p(x,y)/(p(x)p(y)),

and I(X;Y) can be written as

I(X;Y) = ∑_{x,y} p(x,y) log( p(x,y)/(p(x)p(y)) ).

It follows that

I(X;Y) = H(X) + H(Y) − H(X,Y) = H(X) − H(X|Y) = H(Y) − H(Y|X).

For continuous random variables X and Y with a joint density function f(x,y) and marginal density functions f(x) and f(y),

I(X;Y) = ∫ f(x,y) log( f(x,y)/(f(x)f(y)) ) dx dy = h(X) + h(Y) − h(X,Y).

Let P = {P_1,···,P_k} be a finite partition of the image of the random variable X. The quantization of X with partition P is the discrete random variable [X]_P defined by

[X]_P = i  if  X ∈ P_i,  i = 1,···,k.

Thus the joint distribution of two quantizations [X]_P and [Y]_Q is defined as

Pr([X]_P = i, [Y]_Q = j) = μ_XY(P_i × Q_j),

where Q = {Q_1,···,Q_s} is a finite partition of the image of the random variable Y.

An equivalent definition of mutual information using quantization [6] gives the relationship between the mutual information of discrete random variables and that of continuous random variables.

Definition 3.2 The mutual information between two random variables X and Y is defined as

I(X;Y) = sup_{P,Q} I([X]_P; [Y]_Q),

where P and Q run over all finite partitions.
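Definition 3.2 suggests a direct plug-in estimator of mutual information: quantize samples of X and Y with finite partitions and compute the discrete mutual information, refining the partitions. The sketch below is our illustration (not from the paper); it draws jointly Gaussian samples with an arbitrarily chosen correlation ρ, for which the exact value I(X;Y) = −(1/2) log(1 − ρ²) is available for comparison.

import numpy as np

rng = np.random.default_rng(0)
rho, n = 0.8, 200_000
x = rng.standard_normal(n)                                  # X ~ N(0, 1)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)  # corr(X, Y) = rho

def quantized_mi(x, y, bins):
    # Plug-in mutual information (nats) of the quantizations [X]_P, [Y]_Q
    # on equal-width partitions with `bins` cells per coordinate.
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask]))

exact = -0.5 * np.log(1 - rho**2)
for bins in (4, 16, 64):
    print(f"bins={bins:3d}  I_hat={quantized_mi(x, y, bins):.4f}  exact={exact:.4f}")

Coarse partitions undershoot the true value, and refining the partition recovers it, as the supremum in Definition 3.2 suggests (with a small positive sampling bias for very fine partitions).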

    We collect some useful properties about mutual information which immediately follow from Theorem 1.6.3 of [16].

Lemma 3.3 Suppose that X, Y, Z are three random variables. Then the following properties hold:

    1)I(X;Y)=I(Y;X);

    2)I(X;Y)≥0;

    3)I(X;(Y,Z))≥I(X;Z).

The mutual information between the past and future of a stationary process is defined as follows (see [18] for the stationary Gaussian process):

Definition 3.4 Let X = {X_n, n ∈ Z} be a stationary process with finite entropy H(n) (or h(n)) for each n ∈ Z+. Let I_{p-f}(X,n) := I(X_{−(n−1)}···X_0; X_1···X_n) be the mutual information between the past and future with length n. The mutual information between the past and future, I_{p-f}(X), is defined as

I_{p-f}(X) := lim_{n→+∞} I_{p-f}(X, n).

For convenience, I_{p-f}(X) and I_{p-f}(X,n) are denoted by I_{p-f} and I_{p-f}(n) as long as no confusion occurs.

Remark 3.5 1) For convenience, we set I_{p-f}(0) = 0.

2) By Lemma 3.3 and the stationarity of X, I_{p-f}(n) = I(X_1···X_n; X_{n+1}···X_{2n}). This is nonnegative and nondecreasing, so I_{p-f} is either infinite or a nonnegative constant.

3) The mutual information between the past and future can also be defined as lim_{n,m→+∞} I_{p-f}(X,n,m), where I_{p-f}(X,n,m) := I(X_1···X_n; X_{n+1}···X_{n+m}). Following from the monotonicity of mutual information (the third claim of Lemma 3.3), this definition is equivalent to Definition 3.4.

4) The definition of the mutual information I_{p-f} requires finite entropy, which needs only a weak moment condition rather than a second moment condition. In more detail, for a random variable X, if E|X|^δ < +∞ for some δ > 0, then H(X) < +∞, with an explicit bound in terms of the Gamma function Γ [3, 9, 26].

Now we distinguish between long memory and short memory stationary processes by using I_{p-f}.

Definition 3.6 A stationary process is long memory if I_{p-f} is infinite, and it is short memory if I_{p-f} is finite.

The definition of long memory from the perspective of I_{p-f} was discussed in the case of a stationary Gaussian process by Li [18].

The information gain S_n is defined as follows: for n ≥ 1, let S_n := ΔI_{p-f}(n) = I_{p-f}(n) − I_{p-f}(n−1) be the n-th information gain. Since I_{p-f}(n) is nondecreasing, S_n ≥ 0. We have that

I_{p-f}(N) = ∑_{n=1}^{N} S_n  and  I_{p-f} = ∑_{n=1}^{∞} S_n.

In practice, I_{p-f} is usually approximated by I_{p-f}(N) for some large positive integer N. In this case, the acquired information is ∑_{n=1}^{N} S_n = I_{p-f}(N), and the missed information is ∑_{n=N+1}^{∞} S_n = I_{p-f} − I_{p-f}(N).

If I_{p-f} < +∞, as N → +∞, the ratio of the acquired information to the missed information is

AMR(N) := I_{p-f}(N) / (I_{p-f} − I_{p-f}(N)) → +∞.

If I_{p-f} = +∞, no matter what N is, the ratio of the acquired information to the missed information is

AMR(N) = I_{p-f}(N) / (I_{p-f} − I_{p-f}(N)) = 0.

One can see that there is a significant difference in the asymptotic behavior of {AMR(N)} between a stationary process with I_{p-f} < +∞ and one with I_{p-f} = +∞.
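To see this contrast numerically, the sketch below (ours) uses a Gaussian MA(1) process X_t = ε_t + θε_{t−1}, a short memory example. I_{p-f}(N) is computed from Gaussian block entropies via I_{p-f}(N) = 2h(N) − h(2N) (see Section 4), and the limit I_{p-f} = −(1/2) log(1 − θ²) is our own computation from Lemma 3.9 below, using the cepstrum coefficients b_k = (−1)^{k+1} θ^k / k of this process. The ratio AMR(N) then blows up quickly.

import numpy as np

theta = 0.8
# Autocovariance of the Gaussian MA(1) X_t = eps_t + theta*eps_{t-1}, unit innovations.
r = lambda k: np.where(k == 0, 1 + theta**2, np.where(np.abs(k) == 1, theta, 0.0))

def I_pf(n):
    # I_{p-f}(n) = 2h(n) - h(2n) in nats; the (2 pi e)^n terms cancel,
    # leaving 0.5*(2 log det K_n - log det K_{2n}) for Toeplitz covariances K_m.
    def logdet(m):
        idx = np.arange(m)
        return np.linalg.slogdet(r(idx[:, None] - idx[None, :]))[1]
    return 0.5 * (2 * logdet(n) - logdet(2 * n))

I_total = -0.5 * np.log(1 - theta**2)   # I_{p-f} = 0.5 * sum_k k b_k^2 (Lemma 3.9)
for N in (1, 2, 4, 8, 16):
    acq = I_pf(N)
    print(f"N={N:2d}  acquired={acq:.6f}  missed={I_total - acq:.2e}  "
          f"AMR(N)={acq / (I_total - acq):.3g}")

For a long memory process, the missed information stays infinite, so AMR(N) = 0 for every N.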

    3.2 Invariance Under One-to-One Transformation

In what follows, we show that the mutual information I_{p-f} is invariant under one-to-one transformations.

Lemma 3.7 Let g_1 and g_2 be two one-to-one transformations, and let X and Y be two random variables. We have that I(X;Y) = I(g_1(X); g_2(Y)).

Proof Let P = (P_1,···,P_k) and Q = (Q_1,···,Q_m) be partitions of the ranges of the random variables X and Y, respectively. Since g_1 and g_2 are one-to-one transformations, g_1(P) = (g_1(P_1),···,g_1(P_k)) and g_2(Q) = (g_2(Q_1),···,g_2(Q_m)) are partitions of the ranges of the random variables g_1(X) and g_2(Y), respectively.

Observe that, for all i ∈ {1,2,···,k} and j ∈ {1,2,···,m},

P(X ∈ P_i, Y ∈ Q_j) = P(g_1(X) ∈ g_1(P_i), g_2(Y) ∈ g_2(Q_j)),   (3.1)

P(X ∈ P_i) = P(g_1(X) ∈ g_1(P_i)),  P(Y ∈ Q_j) = P(g_2(Y) ∈ g_2(Q_j)).   (3.2)

By eq. (3.1) and eq. (3.2), we know that

I([X]_P; [Y]_Q) = I([g_1(X)]_{g_1(P)}; [g_2(Y)]_{g_2(Q)}) ≤ I(g_1(X); g_2(Y)).

Taking the supremum over P and Q, I(X;Y) ≤ I(g_1(X); g_2(Y)).

By a symmetric argument, I(g_1(X); g_2(Y)) ≤ I(X;Y).

Lemma 3.7 is proven.

Theorem 3.8 Let X = {X_n, n ∈ Z} be a stationary stochastic process, and let g be a one-to-one transformation. Then the mutual information between the past and future, I_{p-f}, of the process Y := g(X) = {g(X_n), n ∈ Z} is equal to that of X.

Proof Since g is a one-to-one transformation, by Lemma 3.7 (applied to the random vectors (X_{−(n−1)},···,X_0) and (X_1,···,X_n)), we have that

I(Y_{−(n−1)}···Y_0; Y_1···Y_n) = I(X_{−(n−1)}···X_0; X_1···X_n),

which implies that I_{p-f}(X,n) = I_{p-f}(Y,n) for all n ≥ 0.

As a result, the stationary processes X and Y admit the same I_{p-f}.

The significance of this definition of long memory lies in its consistency: if there is a one-to-one correspondence between two stationary processes, then either both of them are long memory or neither of them is.

    3.3 Stationary Gaussian Process

In this subsection, we discuss the relationship between the definitions of long memory in the sense of mutual information and in the sense of covariance for stationary Gaussian processes.

Let X = {X_n, n ∈ Z} be a zero-mean stationary Gaussian stochastic process. X can be completely characterized by its correlation function r_{k,j} = r_{k−j} = E[X_k X_j], or, equivalently, by its power spectral density f(λ), which is the Fourier transform of the covariance function:

f(λ) = ∑_{n=−∞}^{+∞} r_n e^{−inλ},  λ ∈ [−π, π].

Set that

b_k = (1/2π) ∫_{−π}^{π} e^{−ikλ} log f(λ) dλ

for any integer k, if it is well defined. Here {b_k} are referred to as the cepstrum coefficients [18].
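The cepstrum coefficients are just the Fourier coefficients of log f, so they are easy to compute numerically. As a sanity check (ours, not from the paper), the sketch below evaluates b_k by the FFT for the MA(1) spectral density f(λ) = |1 + θe^{−iλ}|², whose cepstrum b_k = (−1)^{k+1} θ^k / k can be computed by hand from the expansion of log(1 + θz).

import numpy as np

theta = 0.5
M = 1 << 12
lam = 2 * np.pi * np.arange(M) / M              # grid on [0, 2*pi)
f = np.abs(1 + theta * np.exp(-1j * lam))**2    # MA(1) spectral density (toy choice)

# b_k = (1/2pi) * integral of log f(lam) e^{-i k lam}, approximated by the FFT.
b = np.fft.fft(np.log(f)) / M
for k in (1, 2, 3, 4):
    exact = (-1)**(k + 1) * theta**k / k        # hand-computed cepstrum of this f
    print(f"k={k}  b_k(FFT)={b[k].real:+.6f}  exact={exact:+.6f}")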

The following Lemma 3.9 was proven by Li in [18]:

Lemma 3.9 ([18]) Let X = {X_n, n ∈ Z} be a stationary Gaussian process.

1) I_{p-f} is finite if and only if the cepstrum coefficients satisfy the condition that ∑_{k=1}^{∞} k b_k² < ∞. In this case,

I_{p-f} = (1/2) ∑_{k=1}^{∞} k b_k².

2) If the spectral density f(λ) is continuous, and f(λ) > 0, then I_{p-f} is finite if and only if the autocovariance functions satisfy the condition that ∑_{n=1}^{∞} n r_n² < ∞.

Lemma 3.10 Let {c(n)} be a decreasing positive series. Then ∑_{n=1}^{∞} c(n) < +∞ implies that ∑_{n=1}^{∞} n c(n)² < +∞.

Proof Since {c(n)} is a decreasing positive series whose sum converges, {n c(n)} → 0. Thus {n c(n)} is bounded by a constant M. Therefore,

∑_{n=1}^{∞} n c(n)² = ∑_{n=1}^{∞} (n c(n)) c(n) ≤ M ∑_{n=1}^{∞} c(n) < +∞.  □

Remember that, for a stationary Gaussian process X with autocovariance {r_n}, X is long memory if ∑_n |r_n| = +∞. The following result shows the relationship between ∑_n |r_n| and I_{p-f} for a stationary Gaussian process:

Theorem 3.11 Let X = {X_n, n ∈ Z} be a stationary Gaussian process with decreasing {|r_n|}. Suppose that the spectral density f(λ) is continuous and that f(λ) > 0. Then

∑_{n=1}^{∞} |r_n| < +∞  ⟹  I_{p-f} < +∞.

In other words, if X is not long memory in the sense of covariance, then it is also not long memory in the sense of mutual information.

The proof of Theorem 3.11 follows immediately from Lemmas 3.9 and 3.10.

Remark 3.12 Setting a(n) = 1/(n log n) for n ≥ 2, we have that ∑_n n a(n)² < +∞, but also that ∑_n a(n) = +∞: the converse implication in Lemma 3.10 is not true. Thus, for the processes considered in Theorem 3.11, long memory in the sense of mutual information is stricter than long memory in the sense of covariance.

    3.4 Fractional Gaussian Noise

Fractional Brownian motion (FBM) has been widely applied to a large number of natural shapes and phenomena. An FBM with Hurst parameter H ∈ (0,1) is a centered continuous-time Gaussian process B_H(·) with the covariance function

E[B_H(s) B_H(t)] = (1/2)(s^{2H} + t^{2H} − |s − t|^{2H})

for s, t ≥ 0. B_H reduces to an ordinary Brownian motion for H = 1/2.

The incremental process of an FBM is a stationary discrete-time process called fractional Gaussian noise (FGN). The autocovariance function of the (unit-variance) FGN X = {X_k : k = 0,1,···} can be derived as follows:

ρ_k = (1/2)(|k+1|^{2H} − 2|k|^{2H} + |k−1|^{2H}).

It is plain to see that

ρ_k ~ H(2H−1)|k|^{2H−2}   (3.4)

as |k| → ∞. Of course, if H = 1/2, then ρ_k = 0 for all k ≥ 1 (a Brownian motion has independent increments). One can conclude that the summability of correlations (∑_k |ρ_k| < +∞) holds when 0 < H < 1/2, but fails when 1/2 < H < 1. An FGN with H > 1/2 has become commonly accepted as having long memory, and a lack of summability of correlations has become popular as part of the definition of long memory.
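A quick numerical check (ours) of this dichotomy: partial sums of |ρ_k| computed from the autocovariance formula above saturate for H < 1/2 and keep growing (like k^{2H−1}) for H > 1/2.

import numpy as np

def fgn_acov(k, H):
    # Autocovariance rho_k of unit-variance fractional Gaussian noise.
    k = np.abs(np.asarray(k, dtype=float))
    return 0.5 * ((k + 1)**(2*H) - 2 * k**(2*H) + np.abs(k - 1)**(2*H))

ks = np.arange(1, 10**6 + 1)
for H in (0.3, 0.7):
    partial = np.cumsum(np.abs(fgn_acov(ks, H)))
    sums = [f"{partial[10**j - 1]:.3f}" for j in range(2, 7)]
    print(f"H={H}: partial sums of |rho_k| at k=1e2..1e6 ->", ", ".join(sums))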

The following result shows that long memory in the sense of mutual information is equivalent to that in the sense of covariance for FGN:

Theorem 3.13 Let X = {X_n, n ∈ Z} be the (discrete) increment process of a fractional Brownian motion with Hurst parameter H ∈ (0,1) and H ≠ 1/2. Then

I_{p-f}(X) = +∞  if and only if  H ∈ (1/2, 1).

Remark 3.14 This theorem shows that, for fractional Gaussian noise, the informatic characterization of long memory is identical to the second moment characterization.

Proof The spectral density of the increment of fractional Brownian motion B_H(t) (fractional Gaussian noise) was obtained by Sinai [24] as

f(λ) = c_H (1 − cos λ) ∑_{n=−∞}^{+∞} |λ + 2πn|^{−(2H+1)},

where Γ(·) denotes the Gamma function and

c_H = π^{−1} Γ(2H+1) sin(πH),

for −π ≤ λ ≤ π.

This spectral density can be rewritten as

f(λ) = C (1 − cos λ) ∑_{n=−∞}^{+∞} |λ + 2πn|^{−(2H+1)},

where C is a positive constant.

It can be seen that the spectral density of FGN is positive.

The spectral density f(λ) is proportional to |λ|^{1−2H} near λ = 0. Thus, when H ∈ (0,1/2), f(λ) is continuous, but, when H ∈ (1/2,1), it is not continuous at λ = 0.

Notice that, if H = 1/2, the FGN is the increment of classical Brownian motion, and it follows that I_{p-f} = 0.

    The theorem will be proven in two steps.

• Step 1: H ∈ (0, 1/2) ⟹ I_{p-f} < ∞.

When H ∈ (0, 1/2), the spectral density of FGN is positive and continuous, and, by eq. (3.4), ∑_n |ρ_n| < +∞. Following from Theorem 3.11, we have that I_{p-f} < ∞. We have proven that H ∈ (0,1/2) ⟹ I_{p-f} < ∞.

• Step 2: H ∈ (1/2, 1) ⟹ I_{p-f} = ∞.

Since the spectral density of FGN for H ∈ (1/2,1) is not continuous at λ = 0, we prove this step by the first claim of Lemma 3.9, which does not require a continuous spectral density.

Now we estimate the mutual information of FGN via the logarithm of the spectral density. We have that

b(n) = (1/2π) ∫_{−π}^{π} e^{−inλ} log f(λ) dλ.

log f(λ) is an even function on [−π,π], i.e., log f(λ) = log f(−λ). For n ≠ 0, writing 1 − cos λ = (λ²/2)(1 + g_1(λ)) with g_1(λ) := 2(1 − cos λ)/λ² − 1, we obtain the following decomposition:

b(n) = b_1(n) + b_2(n) + b_3(n),

where b_1(n), b_2(n) and b_3(n) are the n-th Fourier coefficients of (1 − 2H) log|λ|, of log[(1 + g_1(λ))/2], and of the remaining term log(|λ|^{2H+1} ∑_{m∈Z} |λ + 2πm|^{−(2H+1)}), respectively (the constant log C does not contribute for n ≠ 0).

For b_1(n), we get that

b_1(n) = ((1−2H)/π) ∫_0^π log λ · cos(nλ) dλ = −((1−2H)/(πn)) ∫_0^{nπ} (sin u/u) du.

Thus b_1(n) ~ (2H−1)/(2n) as n goes to infinity.

For a twice continuously differentiable function g, the Fourier coefficient of order n behaves like O(1/n²) [25]. Observe that log[(1 + g_1(λ))/2] is twice continuously differentiable, so there exists a positive constant M_1 < ∞ such that

|b_2(n)| ≤ M_1/n².

Now we estimate b_3(n). Denote that

A_H(x) := ∑_{m≠0} |x + 2πm|^{−(2H+1)},

so that the remaining term equals log(1 + |x|^{2H+1} A_H(x)). We have that

b_3(n) = (1/π) ∫_0^π cos(nx) log(1 + |x|^{2H+1} A_H(x)) dx.

Since, when H > 1/2, |x|^{2H−1}, |x|^{2H}, |x|^{2H+1}, A_H(x), A'_H(x) and A''_H(x) are continuous functions on [−π,π], the first and second derivatives of log(1 + |x|^{2H+1} A_H(x)) are also continuous functions on [−π,π]. We conclude that

|b_3(n)| ≤ M_2/n²

for some positive constant M_2 < ∞.

Combining the estimations of b_1(n), b_2(n) and b_3(n),

b(n) = (2H−1)/(2n) + O(1/n²).

It follows that there exists a positive integer N_0 such that |b(n)| ≥ (2H−1)/(4n) for n > N_0.

Hence, by Lemma 3.9, the mutual information is

I_{p-f} ≥ (1/2) ∑_{n>N_0} n b(n)² ≥ (1/2) ∑_{n>N_0} (2H−1)²/(16n) = +∞.  □

Remark 3.15 From the proof of Theorem 3.13, it can be seen that, for H > 1/2, |b(n)| ~ (2H−1)/(2n) as n → +∞, which implies that the partial sums (1/2) ∑_{k=1}^{n} k b(k)², and hence I_{p-f}(n), diverge only at a logarithmic rate, approximately ((2H−1)²/8) log n.
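As a rough numerical illustration (ours) of this slow divergence, the sketch below computes I_{p-f}(n) = 2h(n) − h(2n) (see Section 4) from the Toeplitz covariance matrices of FGN with H = 0.7: the values keep increasing without saturating, and the gain per doubling of n settles near a small positive constant, as a logarithmic rate suggests.

import numpy as np

H = 0.7
def r(k):
    # Autocovariance of unit-variance fractional Gaussian noise.
    k = np.abs(np.asarray(k, dtype=float))
    return 0.5 * ((k + 1)**(2*H) - 2 * k**(2*H) + np.abs(k - 1)**(2*H))

def I_pf(n):
    # I_{p-f}(n) = 2h(n) - h(2n); the (2 pi e)^n factors cancel.
    def logdet(m):
        idx = np.arange(m)
        return np.linalg.slogdet(r(idx[:, None] - idx[None, :]))[1]
    return 0.5 * (2 * logdet(n) - logdet(2 * n))

prev = None
for n in (16, 32, 64, 128, 256, 512):
    val = I_pf(n)
    note = "" if prev is None else f"  gain per doubling = {val - prev:.4f}"
    print(f"n={n:4d}  I_pf(n)={val:.4f}{note}")
    prev = val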

    4 Excess Entropy

In this section, we relate I_{p-f} to the excess entropy, which is an intuitive measure of the memory stored in a stationary stochastic process, so that we can obtain a way of calculating I_{p-f}.

First we recall the definition of the entropy rate of a stochastic process X = {X_n} [16].

Definition 4.1 Let X = {X_n, n ∈ Z} be a stochastic process.

1) Assume that each X_n is a discrete random variable. The Shannon entropy rate of X is defined by

h_μ(X) := lim_{n→∞} H(X_1,···,X_n)/n,

when the limit exists.

2) Assume that (X_1,···,X_n) has a continuous joint distribution for each n ∈ Z+. The differential entropy rate of X is defined by

h_μ(X) := lim_{n→∞} h(X_1,···,X_n)/n,

when the limit exists.

It is known that, for a stationary process with finite block entropy, the Shannon (differential) entropy rate always exists [6, 16]. In what follows, when we mention the entropy rate h_μ of a stationary process X with finite block entropy, h_μ is a Shannon entropy rate if X is a discrete-valued process, and a differential entropy rate if X is a continuous-valued process.

If X is a discrete-valued stationary process, by Lemma 2.1, ΔH(n) = H(n) − H(n−1) is nonnegative and nonincreasing. Then the limit of ΔH(n) exists and is finite, and it is equal to the entropy rate h_μ, because Cesàro means converge to the same limit:

h_μ = lim_{n→∞} H(n)/n = lim_{n→∞} (1/n) ∑_{k=1}^{n} ΔH(k) = lim_{n→∞} ΔH(n).

If X is a continuous-valued stationary process, by Lemma 2.1, Δh(n) = h(n) − h(n−1) is nonincreasing and may be negative. Because the entropy rate h_μ is equal to lim_{n→∞} Δh(n), h_μ exists but may be −∞ [23]. Details on the differential entropy rate can be found in [10, 16].

We now give an example concerning the different values of h_μ in the case of a continuous-valued stationary process.

Example 1 Supposing that {X_n} is a stationary Gaussian process, we have the joint entropy [6]

h(n) = (1/2) log((2πe)^n det K^{(n)}),

where K^{(n)} is the covariance matrix with elements K^{(n)}_{ij} = r(i−j) = E(X_i − EX_i)(X_j − EX_j). Thus it is Toeplitz with entries r(0), r(1),···,r(n−1) along the top row. The density of the eigenvalues of K^{(n)} tends to the spectrum of the process as n → ∞. It has been shown by Kolmogorov that the differential entropy rate of a stationary Gaussian process is given by

h_μ = (1/2) log(2πe) + (1/4π) ∫_{−π}^{π} log S(λ) dλ,

where S(λ) is the power spectral density of the stationary Gaussian process X. On the other hand, if S(λ) vanishes on a set of positive Lebesgue measure, then ∫_{−π}^{π} log S(λ) dλ = −∞ and hence h_μ = −∞.
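Kolmogorov's formula is easy to verify numerically. The sketch below (ours) uses a Gaussian AR(1) process X_t = φX_{t−1} + ε_t with unit innovation variance, for which the entropy rate is the classical h_μ = (1/2) log(2πe); it compares Δh(n), computed from Toeplitz determinants, with the spectral integral, under the convention S(λ) = ∑_k r(k)e^{−ikλ} used above.

import numpy as np

phi = 0.6                                         # AR(1) coefficient (toy choice)
r = lambda k: phi**np.abs(k) / (1 - phi**2)       # autocovariance r(k), unit innovations

def h_block(n):
    # Gaussian block entropy h(n) = 0.5 * log((2 pi e)^n det K^(n)), in nats.
    idx = np.arange(n)
    K = r(idx[:, None] - idx[None, :])
    return 0.5 * (n * np.log(2 * np.pi * np.e) + np.linalg.slogdet(K)[1])

lam = np.linspace(-np.pi, np.pi, 1 << 14, endpoint=False)
S = 1.0 / np.abs(1 - phi * np.exp(-1j * lam))**2  # spectral density of the AR(1)
h_mu_spec = 0.5 * np.log(2 * np.pi * np.e) + 0.5 * np.mean(np.log(S))

n = 200
print(f"Delta h(n)       = {h_block(n) - h_block(n - 1):.6f}")
print(f"Kolmogorov       = {h_mu_spec:.6f}")
print(f"0.5 log(2 pi e)  = {0.5 * np.log(2 * np.pi * np.e):.6f}  (exact)")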

A significant property of the entropy rate is the AEP (asymptotic equipartition property), also known as the Shannon-McMillan-Breiman theorem, which states that, if h_μ is the entropy rate of a finite-valued stationary ergodic process {X_n}, then

−(1/n) log p(X_1,···,X_n) → h_μ

with probability 1. The entropy rate h_μ quantifies the irreducible randomness in sequences produced by a stationary source: the randomness that remains after the correlations and structures in longer and longer sequence blocks are taken into account.

For a discrete-valued stationary process X, by the definition of the entropy rate,

H(n) ~ n h_μ  as n → ∞.

However, the value h_μ indicates nothing about how H(n)/n approaches this limit. Moreover, there may be sublinear terms in H(n): for example, one may have H(n) ~ n h_μ + c or H(n) ~ n h_μ + log n. The sublinear terms in H(n), and the manner in which H(n) converges to its asymptotic form, may reveal important structural properties of a stationary process.

Definition 4.2 Let X = {X_n, n ∈ Z} be a stationary process with finite block entropy, and let h_μ be the entropy rate of X. Then,

1) if X is a discrete-valued process, the excess entropy of X is

E(X) := ∑_{n=1}^{∞} (ΔH(n) − h_μ);

2) if X is a continuous-valued process, the excess entropy of X is

E(X) := ∑_{n=1}^{∞} (Δh(n) − h_μ).

Remark 4.3 This definition follows from [8]. Note that, for a discrete-valued stationary process and n ∈ Z+,

ΔH(n) ≥ lim_{m→∞} ΔH(m) = h_μ.

Then {ΔH(n) − h_μ}_n is a monotonically nonincreasing, nonnegative sequence that converges to 0. Moreover, the excess entropy is E = +∞ or a nonnegative constant.

For a continuous-valued stationary process and n ∈ Z+,

Δh(n) ≥ lim_{m→∞} Δh(m) = h_μ.

If the entropy rate h_μ > −∞, {Δh(n) − h_μ}_n is a monotonically nonincreasing, nonnegative sequence that converges to 0.

ΔH(n) − h_μ = H(X_n|X_1,···,X_{n−1}) − h_μ is referred to as the per-symbol redundancy r(n), because it tells us how much additional information must be gained about the process in order to reveal the actual per-symbol uncertainty h_μ. In other words, the excess entropy E is the summation of the per-symbol redundancies [8]. Note that, for a stationary process with finite block entropy and a finite entropy rate, the conditional entropy H(X_n|X_{n−1},···,X_1) (or h(X_n|X_{n−1},···,X_1)) converges decreasingly to h_μ; the excess entropy E < +∞ if the rate of convergence is fast, and E = +∞ if the rate of convergence is slow. Substituting ΔH(n) = H(n) − H(n−1) into the definition of the excess entropy, we know that

E = lim_{n→∞} (H(n) − n h_μ).

If the excess entropy is finite, we obtain that H(n) ≈ n h_μ + E as n → ∞.

Using the notion of the entropy rate, one can see that, per symbol, the past has little to say about the future.

Proposition 4.4 If X is a stationary process with finite block entropy, and the entropy rate satisfies h_μ > −∞, then

lim_{n→∞} I_{p-f}(n)/n = 0.

Thus, the dependence between adjacent n-blocks of a stationary process with bounded entropy rate does not grow linearly with n.

Proof Suppose that X is a discrete-valued process. By definition,

I_{p-f}(n) = I(X_{−(n−1)}···X_0; X_1···X_n) = H(X_{−(n−1)}···X_0) + H(X_1···X_n) − H(X_{−(n−1)}···X_n).

Since X is stationary, we obtain that I_{p-f}(n) = 2H(n) − H(2n). It follows that

I_{p-f}(n)/n = 2(H(n)/n) − 2(H(2n)/(2n)) → 2h_μ − 2h_μ = 0.

If X is a continuous-valued process with bounded entropy rate h_μ > −∞, the proof is similar. □

We give two examples of the values of the excess entropy E.

Example 2 For an independent and identically distributed discrete-valued process, the entropy rate is h_μ = H(1), since H(n) = nH(1); thus the excess entropy is E = 0.

Example 3 For an irreducible positive recurrent Markov chain X defined on a countable number of states, given the transition matrix P_{ij}, the entropy rate of X is given by

h_μ = −∑_{i,j} u_i P_{ij} log P_{ij},

where (u_i) is the stationary distribution of the chain. By the Markov property, ΔH(n) = h_μ for all n ≥ 2, so the excess entropy of the Markov chain is

E = H(X_1) − h_μ.
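Here is a minimal numerical sketch (ours) for a two-state chain with an illustrative transition matrix: it computes the stationary distribution, the entropy rate and the excess entropy E = H(X_1) − h_μ.

import numpy as np

# Illustrative transition matrix P of an irreducible two-state Markov chain.
P = np.array([[0.95, 0.05],
              [0.20, 0.80]])

# Stationary distribution u: left eigenvector of P for eigenvalue 1.
w, V = np.linalg.eig(P.T)
u = np.real(V[:, np.argmax(np.real(w))])
u /= u.sum()

xlogx = lambda p: np.where(p > 0, p * np.log2(p), 0.0)
h_mu = -np.sum(u[:, None] * xlogx(P))   # h_mu = -sum_ij u_i P_ij log P_ij, in bits
H1 = -np.sum(xlogx(u))                  # H(X_1) = H(u)
print(f"h_mu = {h_mu:.4f} bits,  E = H(X_1) - h_mu = {H1 - h_mu:.4f} bits")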

Finally, we discuss the relationship between the excess entropy and the mutual information. A useful lemma is given below.

Lemma 4.5 Let {a_n, n ∈ Z+} be a nonincreasing positive sequence, and let A_n := ∑_{k=1}^{n} a_k and B_n := 2A_n − A_{2n}. Suppose that lim_{n→∞} a_n = 0. Then A_n is convergent if and only if B_n is convergent, and these have the same limit when they are convergent.

Proof If A_n is convergent, then ∑_k a_k < ∞ and, {a_n} being nonincreasing, n a_n → 0; since 0 ≤ A_n − B_n = ∑_{k=n+1}^{2n} a_k ≤ n a_{n+1}, B_n converges to the same limit.

Suppose that B_n is convergent. Noticing that

B_{2n} − B_n = 2T_n − T_{2n},  where T_n := ∑_{k=n+1}^{2n} a_k = A_n − B_n,

it follows that ∀ε > 0, ∃N > 0, ∀n > N and ∀s ∈ N,

T_{2^s n} ≥ 2^s T_n − (2^s − 1)ε ≥ 2^s (T_n − ε).

As a result, since 0 ≤ T_m ≤ m a_{m+1} and a_m → 0 imply T_m = o(m), we must have T_n ≤ ε for every n > N.

We obtain, for n > N, that

0 ≤ A_n − B_n = T_n ≤ ε,

so A_n is also convergent and lim_{n→∞} A_n = lim_{n→∞} B_n.

    The lemma is proven.

The following result shows that the excess entropy and the mutual information are identical for a stationary process with finite block entropy:

Theorem 4.6 Let X = {X_n, n ∈ Z} be a stationary process with finite block entropy. Then the excess entropy and the mutual information are identical:

E(X) = I_{p-f}(X).

Proof We prove the result for a discrete-valued stationary process and for a continuous-valued process, respectively.

First, we suppose that X is a discrete-valued process. Denote that

E_n := ∑_{k=1}^{n} (ΔH(k) − h_μ) = H(n) − n h_μ  and  D_n := 2H(n) − H(2n).

Denote that a_n = ΔH(n) − h_μ; by the definition of the entropy rate, we know that a_n ≥ 0, a_n ≥ a_{n+1} for every positive integer n, and that lim_{n→∞} a_n = 0. Furthermore, we have that

D_n = E_n − ∑_{k=n+1}^{2n} a_k = 2E_n − E_{2n}.

We conclude that

E_n ≥ D_n ≥ E_n − n a_n.   (4.2)

In fact, the first inequality follows from the fact that {a_n} is nonnegative, and the second inequality follows from the fact that {a_n} is nonincreasing, as ∑_{k=n+1}^{2n} a_k ≤ n a_{n+1} ≤ n a_n.

Since E_n is the partial summation of the nonnegative series {a_n}, E_n is nondecreasing. Notice that

I_{p-f}(n) = 2H(n) − H(2n) = D_n,

and that both I_{p-f}(n) and D_n are nondecreasing. It follows that the following three limits exist in [0, +∞]:

lim_{n→∞} E_n = E,  lim_{n→∞} D_n = lim_{n→∞} I_{p-f}(n) = I_{p-f}.

By (4.2) and Lemma 4.5 (applied with A_n = E_n and B_n = 2E_n − E_{2n} = D_n), E_n and D_n are convergent at the same time, and they have the same limit. Furthermore, when they converge, we have that

I_{p-f} = lim_{n→∞} D_n = lim_{n→∞} E_n = E.

When E_n and D_n are not convergent, since E_n and D_n are nondecreasing, we have that

I_{p-f} = lim_{n→∞} D_n = +∞ = lim_{n→∞} E_n = E.

    The theorem is proven for a discrete-valued process.

Now we suppose that X is a continuous-valued stationary process.

If the differential entropy rate h_μ is finite, i.e., h_μ > −∞, set ã_n := Δh(n) − h_μ, n = 1,2,···. Then {ã_n} is a nonnegative, nonincreasing sequence that converges to 0. By the same argument as for the discrete-valued process, one can show that I_{p-f} = E.

If the differential entropy rate h_μ is infinite, i.e., h_μ = −∞, we have that ã_n = Δh(n) − h_μ = +∞ for n = 1,2,···, because X admits finite block entropy, so that Δh(n) > −∞. By the definition of excess entropy,

E = +∞.

On the other hand, setting X⁻ := (···, X_{−1}, X_0) and X⁺ := (X_1, X_2, ···) to be the past and future of X, by the third claim of Lemma 3.3,

I_{p-f}(n) = I(X_{−(n−1)}···X_0; X_1···X_n) ≥ I(X_{−(n−1)}···X_0; X_1) = h(X_1) − h(X_1|X_{−(n−1)},···,X_0).

Since the differential entropy rate is h_μ(X) = lim_{n→∞} h(X_1|X_{−(n−1)},···,X_0) = −∞ and h(X_1) is finite, we conclude that

lim_{n→∞} I_{p-f}(n) = +∞.

Hence, I_{p-f} = E = +∞.

The proof for the continuous-valued process is complete.

Remark 4.7 1) The equality E = I_{p-f} is claimed in [8] for a stationary process with a discrete state space, where a heuristic “proof” is also given.

2) The definition of the excess entropy E depends on the entropy rate h_μ. The equality E = I_{p-f} = D provides two sequences approximating I_{p-f} and the excess entropy, {2H(n) − H(2n)}_n as well as {nH(n−1) − (n−1)H(n)}_n, which enable us to obtain lower bounds on the excess entropy E without knowing the entropy rate h_μ (see the numerical sketch after this remark).

3) For a continuous-valued stationary process with finite block entropy, if h_μ = −∞, then I_{p-f} = E = +∞; such a process is always long memory.
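Here is a small sketch (ours) of point 2), for a Gaussian MA(1) process X_t = ε_t + θε_{t−1}; the limit E = I_{p-f} = −(1/2) log(1 − θ²) is our own computation from Lemma 3.9. Both sequences are evaluated from block entropies alone (no knowledge of h_μ) and approach E from below.

import numpy as np

theta = 0.8
r = lambda k: np.where(k == 0, 1 + theta**2, np.where(np.abs(k) == 1, theta, 0.0))

def logdet(n):
    # log det of the n x n Toeplitz covariance matrix of the MA(1) process.
    idx = np.arange(n)
    return np.linalg.slogdet(r(idx[:, None] - idx[None, :]))[1]

E_exact = -0.5 * np.log(1 - theta**2)
for n in (2, 4, 8, 16, 32):
    lb1 = 0.5 * (2 * logdet(n) - logdet(2 * n))            # 2h(n) - h(2n)
    lb2 = 0.5 * (n * logdet(n - 1) - (n - 1) * logdet(n))  # n h(n-1) - (n-1) h(n)
    print(f"n={n:2d}  2h(n)-h(2n)={lb1:.4f}  nh(n-1)-(n-1)h(n)={lb2:.4f}  E={E_exact:.4f}")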

    5 Conclusion

The finiteness or infiniteness of the mutual information between the past and future, I_{p-f}, can be regarded as a sign distinguishing short memory and long memory stationary processes. For a stationary process with finite block entropy, I_{p-f} coincides with the excess entropy E, which provides a good approximation of I_{p-f}. The definitions of I_{p-f} and the excess entropy of a stationary process require only a very weak moment condition on the distribution of the process, and can be applied to processes whose distributions do not have a bounded second moment. A significant property of I_{p-f} is that it is invariant under one-to-one transformations; this invariance enables us to know the I_{p-f} of a stationary process from the I_{p-f} of other processes. Since conditional entropy captures the dependence between random variables well, I_{p-f} and the excess entropy are relevant for capturing the dependence of a stationary process whose distribution is far from Gaussian. For stationary Gaussian processes, long memory in the sense of I_{p-f} is somewhat stricter than long memory in the sense of covariance. For fractional Gaussian noise, I_{p-f} = ∞ if and only if H ∈ (1/2, 1). An important problem here is to provide an effective algorithm for approximating I_{p-f} or the excess entropy E, which is essential for future applications. It would also be interesting to use the informatic approach to consider the long memory behaviors of harmonizable processes and of measure preserving transformations.

Conflict of Interest Yiming Ding is an editorial board member for Acta Mathematica Scientia and was not involved in the editorial review or the decision to publish this article. All authors declare that there are no competing interests.
