
    A 3D morphometric perspective for facial gender analysis and classification using geodesic path curvature features

    Computational Visual Media, 2018, Issue 1

    Hawraa Abbas, Yulia Hicks, David Marshall, Alexei I. Zhurov, and Stephen Richmond

    The Author(s) 2017. This article is published with open access at Springerlink.com.

    1 Introduction

    Gender identification plays a remarkable role in social communication. Humans find this task relatively easy: they are remarkably accurate at determining the gender of subjects from their facial appearance. Even with altered hairstyles, removal of men's facial hair, and no cosmetic cues, humans can still determine subjects' genders from their faces with more than 95% accuracy [1–7]. However, achieving similar accuracy in automatic gender classification using computers remains a challenge. Automatic classification is crucial in many applications, for instance making human–computer interaction (HCI) more user friendly, conducting passive surveillance and access control, and collecting valuable statistics, such as the number of women who enter a store on a given day. Researchers have considered techniques for gender classification since the 1990s, when SexNet, the first automated system capable of gender recognition using the human face, was created [8].

    A particular topic of research, for more than two decades, has been the relationship between facial traits and gender classification or face recognition. Enlow and Moyers [9] contended that men have wider and longer noses than women, and that the male forehead is more bowed and slanting than the female forehead, while Shepherd [10] argued that the female nose is less pointed than the male nose. Another interesting study highlighted the relation between face parts and face recognition rate: the authentication score was obtained by combining the Surface Interpenetration Measure values corresponding to four different face regions (circular and elliptical areas around the nose, the forehead, and the entire face region). Establishing which parts of the face and which facial morphology features are most effective for gender classification remains an open research topic, due to the strong dependency on ethnicity and age.

    There are two main sources of information for gender analysis: the shape and the appearance of the face [5]. 3D facial images are rich in shape-related information, but such images are difficult to capture. In contrast, 2D facial images are easy to capture but poor in shape-related information. Two studies [12, 13], going back to 1997 and 1986 respectively, argued that geometric features are superior to textural features for identifying visually derived semantic information such as gender, age, and expression. Geometric features of faces are usually defined with landmarks; for example, Farkas [14] annotated a set of 23 anthropometric facial landmarks to extract a set of anthropometric facial measurements (Euclidean distances, proportions, and angles). The present study uses Farkas landmarks to define geometric features relevant to gender variations.

    This paper proposes new geodesic geometric features for gender analysis; specifically, these are derived from mean and Gaussian curvatures, shape indices, and curvedness calculated along the geodesic path between two landmarks. The determination of such features along geodesic paths is novel. We conduct a thorough investigation of the utility of our new features and of which parts of the face are the most effective for gender discrimination.

    Direct Euclidean and geodesic distance measures between facial landmarks are quite common as local geometric gender classification features (e.g., Refs. [15, 16]). In the current study, shape features derived from 3D geodesic paths between facial landmarks are used as descriptors to classify gender; as a dataset, we use the extensive Avon Longitudinal Study of Parents and Children (ALSPAC) teenage face database. The mean curvature, Gaussian curvature, shape index, and curvedness are utilised to gain maximum benefit; these features have shown good results in facial morphology classification in the past. For example, mean and Gaussian curvature features have been utilised to classify philtrum morphology [17]. Shape index and curvedness features have been applied in a wide range of 3D face recognition applications [18–22], and provide good classification results, ranging from 77% to 99% accuracy, depending on the dataset's complexity and the algorithms employed. Zhao et al. [23] used a geodesic network generated for each face with predetermined geodesics and iso-geodesics; they then computed the mean curvature, Gaussian curvature, shape index, and curvedness for each network point, and utilised these features for automated 3D facial similarity measurement.

    Furthermore, geodesic distances and curves have been utilised extensively in face recognition systems for faces with different poses and expressions (e.g., Refs. [24, 25]), and even in video processing (e.g., Ref. [26]). Since these features have shown robust results in previous studies on 3D facial applications, our current work bases its new descriptors on a combination of these features along a geodesic path, to achieve better results.

    The new combinations of 3D geodesic path features were assessed in a gender classification application using the ALSPAC dataset, which contains 4745 3D facial meshes of fifteen-year-olds. This is a challenging dataset, as gender discrimination in young subjects is much more difficult than in adults. The results were then compared with the gender classification results for the same dataset obtained using the method in Ref. [27]; our approach was found to improve classification accuracy by over 8%. An important part of our research was determining the most discriminative parts of the face for gender classification. Nose morphology was found to be most discriminative for teenage Caucasian populations.

    2 Related work

    It is logical to focus on biologically significant landmarks in order to extract features for facial gender classification, since gender is a biological characteristic. Facial landmarks can be divided into three broad categories [28]: biological landmarks, mathematical landmarks, and pseudo-landmarks. Biological landmarks, which are often used by scientists and physicians, are meaningful points that are defined as standard reference points on the face and head, such as the pupil, dacryon, nasion, or pogonion. Mathematical landmarks are defined according to certain mathematical or geometric properties of human faces, such as the middle point between two biological landmarks. Pseudo-landmarks are defined using two or more mathematical or anatomical landmarks or hair contours.

    Although the gender classification problem has been the subject of considerable research in recent years, current computer vision methods for facial gender recognition tend to overlook facial biological landmarks as a basis for gender classification, despite their capability to classify gender efficiently with a minimum number of features compared to methods that use global 3D facial geometric features. For example, Ballihi et al. [24] used a large set of geometric curve features (circular curves and radial curves), together with the AdaBoost algorithm for feature selection, to yield a gender classification rate of 86% on the FRGCv2 dataset. We exploit the usefulness of facial biological landmarks in our work by introducing a novel set of 3D geometric features based on anthropometric landmarks, with the goals of classifying gender and discovering the relationship between facial morphology and gender.

    Gender classification and face recognition with landmark-based and simple geometric features have been the subject of much research in the past. For example, Burton et al. [5] manually annotated 73 biological (anthropometric) landmarks for a dataset of 179 subjects, yielding a total of 2628 Euclidean distance measurements. Due to limited computational capacity, the authors handpicked only 19 distances (and related ratios) and, using these features, attained a classification accuracy of 94%. Han et al. [29] utilised more intricate measures, such as the volumes and areas of face portions, to classify gender, but considered a small public dataset of only 61 subjects. The authors used a support vector machine (SVM) classifier to classify the areas and volumes of five local craniofacial regions: the temple, eyes, nose, mouth, and cheeks. Using five-fold cross-validation, they reported 83% gender classification accuracy. Gilani et al. [15] extracted geodesic and Euclidean distances between 23 biological landmarks annotated manually for 64 3D facial meshes. Using these features, the authors proposed an approach that gave a gender classification accuracy of 90% for 3D faces. Toma [27] derived 250 facial parameters (90 Euclidean distances between landmarks, 118 angles, and 42 Euclidean distance ratios) from the large ALSPAC dataset to predict gender with approximately 80% accuracy.

    Finding the relationship between face morphology and gender has also received some recent attention. For example, Brown and Perrett [30] reported results of such an investigation for a database of 32 photographs of male and female faces. Their results showed that the jaw, eyebrows, eyes, and chin contribute (in descending order) to gender perception. For 3D faces, Gilani et al. [15] and Toma [27] considered which parts of the face are most effective in gender discrimination when using distance measurements between anthropometric landmarks. In Ref. [27], the authors used Euclidean distance measures on ALSPAC facial meshes and found the nose ridge to be the most discriminative portion of the face for gender, which is also our finding in this paper. In Ref. [15], the authors used geodesic and Euclidean distances between anthropometric landmarks and found that the distances between the eye and forehead landmarks are the most gender-discriminative distances for 64 adult faces.

    It follows from the above literature survey that there are currently no semi- or fully automated gender classification methods that simultaneously satisfy the following requirements: (1) use surface geometric features dependent on 3D biological facial landmarks; (2) analyse which portions of a 3D face are the most discriminative between males and females; and (3) are validated on a large dataset. These three points motivated us to propose a semi-automatic facial gender classification algorithm based on 3D facial morphology and biological landmarks. As a result, this paper proposes new 3D geometric features based on curvature measures calculated along the geodesic path between 3D facial landmarks. We show that these features provide better gender discrimination than current state-of-the-art methods, owing to their improved capability to represent the shape of 3D facial surfaces.

    3 Dataset and methods

    This section presents an overview of the dataset, landmarks, and tools used in this work.

    3.1 Dataset and landmarks

    The Avon Longitudinal Study of Parents and Children (ALSPAC) dataset is used in the present work. The ALSPAC study was designed to explore how an individual's genotype combines with environmental factors to influence the health, behaviour, and development of children. The initial ALSPAC sample comprised 14,541 pregnant women with estimated delivery dates between April 1991 and December 1992. Of the initial 14,541 pregnancies, all but 69 had known birth outcomes; multiple births comprised 195 sets of twins, 3 sets of triplets, and 1 set of quadruplets. 13,988 children were alive at one year. Mothers were asked to complete postal questionnaires covering a range of health outcomes [31]. Ethical approval for using these data in the present study was obtained from the ALSPAC Ethics and Law Committee and the Local Research Ethics Committees [32]. The cohort was recalled when the children were 15 years of age, and 3D scans of their faces were obtained using two Konica Minolta Vivid 900 laser cameras [33]. The final sample represented normal variation in 4747 British adolescents (2514 females and 2233 males); 92% of these individuals were Caucasian, and the remaining 8% were a mixture of other ethnic groups [34]. Each set of scanned images was imported into Rapidform 2006 (a reverse engineering software package) and processed by removing noise, unwanted areas, and the colour texture, in order to highlight morphological features and eliminate the influence of dissimilar facial colour tones. Then, 21 3D facial landmarks were manually identified and recorded for each 3D facial image using the method of Toma et al. [35]. The biological landmark points for this dataset, as well as their locations on the human face, are shown in Fig. 1, while Table 1 presents their definitions.

    Fig.1 Landmarks used in the present study.

    Table 1 Definitions of facial soft tissue landmarks

    3.2 Estimating curvature: normal cycles

    A variety of algorithms based on the estimation of curvature tensors may be used to calculate curvatures on triangular meshes. For example, Taubin [36] developed a method called normal cycles, in which the principal curvature directions can be approximated by two of the three eigenvectors of a local tensor, and the principal curvature magnitudes can be calculated from linear combinations of two of the three eigenvalues. This theory provides a unified, simple, and accurate way to determine curvatures for both smooth and polyhedral surfaces [37, 38].

    The main idea of the normal cycle theory is that, in order to acquire a continuous tensor field over an entire surface, a piecewise linear curvature tensor field should be calculated by estimating the curvature tensor at each vertex and then interpolating those values linearly across triangles. Figure 2 illustrates the method used to calculate the curvature tensor at each vertex: for every edge e of the mesh there is a minimum curvature (along the edge) and a maximum curvature (across the edge). These line density tensors can be averaged over an arbitrary mesh region B according to the following equation:

    T(v) = (1/|B|) ∑ₑ β(e) |e ∩ B| ē ēᵀ

    Fig.2 Curvature estimation using normal cycles.

    where v represents the vertex position on the mesh, |B| is the surface area around v over which the curvature tensor is estimated, β(e) is the signed angle between the normal vectors of the two oriented triangles incident to edge e, |e ∩ B| is the length of e ∩ B, and ē is a unit vector in the same direction as e [39]. In practice, the normal cycle method is fast and provides excellent results, although the important issue of how the user should choose the neighbourhood B that approximates a geodesic disk around the vertex v still remains. Altering the neighbourhood size can significantly affect the results: small neighbourhoods provide better estimates (for clean data), while an increase in the neighbourhood size smooths the estimates, reducing sensitivity to noise [37, 40]. The eigenvectors of T(v) and their associated eigenvalue magnitudes are used to estimate curvatures at each vertex: the principal curvatures k1 and k2 at v are estimated from the eigenvalues, while the eigenvectors give the curvature directions [17, 39].
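To make the averaging concrete, here is a minimal Python sketch (ours, not from the paper; the function names and the tiny two-triangle mesh are illustrative) that accumulates β(e) |e ∩ B| ē ēᵀ over the interior edges incident to a vertex, taking the region B to cover each incident edge entirely:

```python
import numpy as np

def face_normal(V, f):
    """Unit normal of triangle f = (i, j, k) with vertices from V."""
    n = np.cross(V[f[1]] - V[f[0]], V[f[2]] - V[f[0]])
    return n / np.linalg.norm(n)

def curvature_tensor(V, F, v, area):
    """Normal-cycle estimate T(v) = (1/|B|) * sum_e beta(e) |e| e_bar e_bar^T,
    summed over interior mesh edges incident to vertex v. B is assumed to
    cover each incident edge entirely, so |e ∩ B| is the full edge length."""
    edge_faces = {}
    for fi, f in enumerate(F):
        for a, b in ((f[0], f[1]), (f[1], f[2]), (f[2], f[0])):
            edge_faces.setdefault(frozenset((a, b)), []).append(fi)
    T = np.zeros((3, 3))
    for e, fs in edge_faces.items():
        if v not in e or len(fs) != 2:
            continue  # skip boundary edges and edges not touching v
        a, b = tuple(e)
        evec = V[b] - V[a]
        length = np.linalg.norm(evec)
        ebar = evec / length
        n1, n2 = face_normal(V, F[fs[0]]), face_normal(V, F[fs[1]])
        # beta(e): signed dihedral angle between the two incident face normals
        beta = np.arctan2(np.dot(np.cross(n1, n2), ebar), np.dot(n1, n2))
        T += beta * length * np.outer(ebar, ebar)
    return T / area

# A flat patch (two coplanar triangles sharing the diagonal edge):
V = np.array([[0., 0., 0.], [1., 0., 0.], [1., 1., 0.], [0., 1., 0.]])
F = [(0, 1, 2), (0, 2, 3)]
T = curvature_tensor(V, F, v=0, area=1.0)
```

On the flat patch every dihedral angle β(e) is zero, so the tensor vanishes, as expected for a planar surface.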

    Mean (H) and Gaussian (G) curvatures are given in terms of the principal curvatures k1 and k2 by

    H = (k1 + k2)/2,  G = k1 k2

    The sign of G indicates whether the surface is locally elliptic or hyperbolic [41].

    The shape index S quantitatively measures the shape of a surface at a point p and captures the intuitive notion of the local shape of a surface. Every distinct surface shape corresponds to a unique value of S (except for planar shapes). The shape index at any surface point can be calculated in terms of the principal curvatures k1 and k2 (k1 ≥ k2) at that point as

    S = (2/π) arctan((k1 + k2)/(k1 − k2))

    Another surface feature, called curvedness R, measures how much a surface is bent. Curvedness can capture the scale difference between objects, such as the difference between a soccer ball and a cricket ball. This feature can also be calculated in terms of the principal curvatures, as follows [42]:

    R = sqrt((k1² + k2²)/2)

    In general, the shape index and curvedness are robust surface characteristics of a 3D image; they are invariant to changes in 3D image orientation. Figure 3 shows how variations in shape index and curvedness, S and R, can be represented as polar coordinates within a Cartesian coordinate frame given by the two principal curvatures (k1 and k2).
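All four quantities are simple functions of the principal curvatures. The sketch below (ours; note that the sign convention for the shape index varies between papers) computes them for a given (k1, k2) pair:

```python
import numpy as np

def curvature_descriptors(k1, k2):
    """H, G, S, R from principal curvatures k1 >= k2."""
    H = (k1 + k2) / 2.0               # mean curvature
    G = k1 * k2                       # Gaussian curvature
    # shape index; undefined at planar points (k1 = k2 = 0)
    S = (2.0 / np.pi) * np.arctan2(k1 + k2, k1 - k2)
    R = np.sqrt((k1 ** 2 + k2 ** 2) / 2.0)   # curvedness
    return H, G, S, R

# A unit sphere has k1 = k2 = 1 everywhere:
H, G, S, R = curvature_descriptors(1.0, 1.0)
```

For a unit sphere this returns H = 1, G = 1, S = 1 (a convex spherical cap), and R = 1; a smaller ball has the same shape index but larger curvedness, which is exactly the scale difference R is meant to capture.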

    3.3 Geodesic path and distance

    A geodesic path is the shortest curve or path between two points on a surface; the geodesic distance may be defined as the length of this curve or path [44] (see Fig. 4). The computation of geodesic paths and distances on triangular meshes is a common problem in many computer graphics and vision applications. Several approaches may be used to compute geodesic distances and paths on triangular meshes; some are approximate, while others are exact. Exact methods include the Mitchell, Mount, and Papadimitriou (MMP) method [45], the Chen and Han (CH) method [46], and the Xin and Wang method [47]; the heat method [48] is a further alternative. Other methods, e.g., Refs. [49–52], could also be considered, to determine whether they influence gender classification accuracy positively.
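As a baseline for intuition, restricting paths to mesh edges already yields a crude upper bound on the geodesic distance. The following Python sketch (ours, not one of the cited methods) runs Dijkstra's algorithm on the edge graph with Euclidean edge lengths:

```python
import heapq
import numpy as np

def edge_graph_distances(V, F, src):
    """Approximate geodesic distances from vertex `src` via Dijkstra on
    the mesh edge graph. Because paths may only follow edges, the result
    is an upper bound on the true geodesic distance."""
    adj = {i: [] for i in range(len(V))}
    for f in F:
        for a, b in ((f[0], f[1]), (f[1], f[2]), (f[2], f[0])):
            w = float(np.linalg.norm(V[a] - V[b]))
            adj[a].append((b, w))
            adj[b].append((a, w))
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, np.inf):
            continue  # stale queue entry
        for nbr, w in adj[u]:
            nd = d + w
            if nd < dist.get(nbr, np.inf):
                dist[nbr] = nd
                heapq.heappush(pq, (nd, nbr))
    return dist

# Unit square split into two triangles; vertex 0 -> 2 uses the diagonal edge.
V = np.array([[0., 0., 0.], [1., 0., 0.], [1., 1., 0.], [0., 1., 0.]])
F = [(0, 1, 2), (0, 2, 3)]
d = edge_graph_distances(V, F, src=0)
```

Exact and fast marching methods improve on this bound by allowing paths to cross triangle interiors rather than following edges only.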

    Fig. 3 Shape index and curvedness represented in 2D space. The indices (S, R) are viewed as polar coordinates in the k1–k2 plane, with planar points mapped to the origin. The effects of variations in curvedness (the radial coordinate) and shape index on the surface structure are shown.

    Fig. 4 Euclidean and geodesic distances. (a) Geodesic distance is the shortest surface distance across the mesh between two landmarks, while (b) Euclidean distance is the straight-line distance between two landmarks.

    We use the popular fast marching method [53]. Although this is an approximate method, in practice the average error is below 1%, and the computational time and memory requirements are considerably lower than those of other methods. In addition, the method has been shown to work well with large 3D meshes [54].

    The fast marching method is a widely used algorithm in computer vision and computer graphics; for instance, it has been utilised to solve global minimisation problems for deformable models [55]. The algorithm works as follows. Suppose we are given a metric M(s)ds on some manifold S satisfying M > 0. Given two points r0 ∈ S and r1 ∈ S, the weighted geodesic distance between r0 and r1 is defined as

    d(S, r0, r1) = min over γ of ∫₀¹ M(γ(s)) ‖γ′(s)‖ ds

    where they’s are all possible piecewise regular curves onSsuch thaty(0)=r0andy(1)=r1.Fixing the pointr0as the starting point,the distanceU(r)=d(S,r0,r)to all other points,r,can be computed by propagating the level set curveCt={r:U(r)=t}using the evolution equation?Ct/?t(r)=nr/M(r),wherenris the exterior unit normal toCtat the pointrandU(r)satisfies the nonlinearEikonalequation[53,56]:

    3.4 Euclidean distance

    Many researchers have frequently used geodesic and Euclidean distances as features for 3D facial recognition, 3D facial morphological analysis, and gender identification. In Ref. [14], 3D Euclidean distances were used to measure the deviations of morphological facial traits from a normal face; such distances have also been used to delineate syndromes in Ref. [57]. Some studies have shown, however, that geodesic distances are more appropriate for gender identification and for measuring levels of facial masculinity/femininity [15, 58]. The present work uses both Euclidean and geodesic distances as features for gender classification in order to compare their performance with the proposed geodesic curvature features (i.e., mean curvature, Gaussian curvature, shape index, and curvedness along geodesic paths between landmarks). Figure 4 shows the difference between Euclidean and geodesic distances.

    3.5 LDA classifier

    The present study uses linear discriminant analysis (LDA) as a binary classifier to predict the gender of 4745 3D facial meshes, since this classifier is easy to implement and does not require the adjustment of any tuning parameters. LDA has been successfully used for gender classification in the past [15, 59]. In preliminary tests, we also found that LDA outperformed another popular classifier, the support vector machine (SVM) [60].

    LDA attempts to maximise the ratio of between-class scatter to within-class scatter. Suppose we have n-dimensional elements {x1, ..., xn}, N1 of which belong to W1 (the first class) and N2 to W2 (the second class). To compute the linear discriminant projection for these two classes, the following steps should be followed.

    Calculate the class means:

    μi = (1/Ni) ∑ over x ∈ Wi of x,  i = 1, 2

    Next, calculate the class covariance matrices:

    Si = ∑ over x ∈ Wi of (x − μi)(x − μi)ᵀ,  i = 1, 2

    Then the within-class scatter matrix is SW = S1 + S2, and the between-class scatter matrix is

    SB = (μ1 − μ2)(μ1 − μ2)ᵀ

    The LDA projection W* is then obtained as the solution to the generalised eigenvalue problem:

    SB w = λ SW w,  i.e.,  W* = argmax over W of |Wᵀ SB W| / |Wᵀ SW W|

    Gender recognition is then performed by projecting the extracted 3D facial feature descriptor of the test face onto the LDA space and computing its Euclidean distance to the two class means projected onto the same space, μ̃1 = (W*)ᵀ μ1 and μ̃2 = (W*)ᵀ μ2; the nearer projected mean determines the predicted gender [15, 58, 61].
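For two classes the steps above admit a closed form: the projection vector satisfies w ∝ SW⁻¹(μ1 − μ2). A minimal sketch (ours; two toy clusters stand in for the male/female facial descriptors):

```python
import numpy as np

def lda_train(X1, X2):
    """Two-class LDA: solve SW w = (mu1 - mu2) and return the projection
    vector together with the two projected class means."""
    mu1, mu2 = X1.mean(axis=0), X2.mean(axis=0)
    S1 = (X1 - mu1).T @ (X1 - mu1)   # class scatter matrices
    S2 = (X2 - mu2).T @ (X2 - mu2)
    SW = S1 + S2                     # within-class scatter
    w = np.linalg.solve(SW, mu1 - mu2)
    return w, w @ mu1, w @ mu2

def lda_predict(w, p1, p2, x):
    """Assign x to the class whose projected mean is nearer."""
    p = w @ x
    return 1 if abs(p - p1) <= abs(p - p2) else 2

# Two well-separated toy clusters:
X1 = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.]])
X2 = X1 + 5.0
w, p1, p2 = lda_train(X1, X2)
```

A test point near the first cluster, e.g. (0.5, 0.5), projects closest to the first projected mean and is assigned class 1.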

    3.6 Verification and validation

    An n-fold cross-validation scheme was chosen to evaluate the capability of LDA for gender classification. A cross-validation procedure is generally recommended in data mining and machine learning for estimating a classifier's performance, as well as for avoiding over-fitting. In this scheme, the dataset is divided into equal subsets: one is used for testing, while the others are used to train the LDA classifier. This step is repeated for the other subsets so that each is used once as the test sample. Three measures are used to assess the performance of the LDA classifier: percentage accuracy, sensitivity, and specificity [62]:

    Accuracy = (TP + TN)/(TP + TN + FP + FN),  Sensitivity = TP/(TP + FN),  Specificity = TN/(TN + FP)

    where TP is the number of true positives (i.e., LDA identifies as male someone who was labelled as such), TN is the number of true negatives (i.e., the classifier recognises as female someone who was labelled as such), FP is the number of false male classifications, and FN is the number of false female classifications. Accuracy indicates overall detection performance, sensitivity is defined as the capability of the features to accurately identify a male, and specificity indicates the features' capability to avoid false male identifications.
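All three measures follow directly from the four confusion-matrix counts; a small sketch (ours, with made-up counts):

```python
def classification_metrics(tp, tn, fp, fn):
    """Accuracy, sensitivity (male-detection rate), and specificity
    (female-detection rate) from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return accuracy, sensitivity, specificity

# 90 of 100 males and 80 of 100 females correctly classified:
acc, sens, spec = classification_metrics(tp=90, tn=80, fp=20, fn=10)
```

This example gives an accuracy of 0.85, a sensitivity of 0.90, and a specificity of 0.80.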

    4 Gender analysis approach

    An overview of our gender analysis approach is given in Fig. 5, while the different components of the algorithm are explained below.

    4.1 Preprocessing

    The current study used 4745 subjects (all British adolescents: 2512 females and 2233 males) from the ALSPAC dataset; all 3D faces had neutral expressions and a frontal view. For each face, all 21 3D landmarks were used to extract feature descriptors. Generalised Procrustes Analysis (GPA) was performed to register (align) all sets of 21 facial landmarks by removing translation and rotation [63], and the respective facial meshes were transformed accordingly; this step was required to reduce errors due to different face positions during computation of geodesic distances. A Laplacian smoothing filter [64] was then used to reduce high-frequency information (i.e., spikes) in the geometry of the mesh, giving the mesh a better shape and distributing the vertices more evenly [65].
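The core step that GPA iterates is the rigid alignment of one landmark set to a reference; a minimal sketch of that single step (ours, using the SVD-based Kabsch solution rather than the full iterative GPA of Ref. [63]):

```python
import numpy as np

def rigid_align(A, B):
    """Align landmark set B to A by removing translation and rotation
    (the alignment step that GPA iterates over all faces in the sample)."""
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    A0, B0 = A - ca, B - cb                # remove translation
    U, _, Vt = np.linalg.svd(B0.T @ A0)    # optimal rotation via SVD
    if np.linalg.det(U @ Vt) < 0:          # guard against reflections
        U[:, -1] *= -1
    R = U @ Vt
    return B0 @ R + ca

# Five landmarks A, and a rotated + translated copy B:
A = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.],
              [0., 0., 1.], [1., 1., 1.]])
t = 0.7
Rz = np.array([[np.cos(t), -np.sin(t), 0.],
               [np.sin(t),  np.cos(t), 0.],
               [0., 0., 1.]])
B = A @ Rz + np.array([2., 3., 4.])
aligned = rigid_align(A, B)
```

After alignment, `aligned` coincides with A up to numerical precision; GPA repeats such alignments against a running mean shape until convergence.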

    4.2 Feature extraction

    4.2.1 Extraction of geodesic paths

    Peyre’s MATLAB fast marching toolbox [66] was used to determine geodesic paths and geodesic distances between landmarks. To this end, a number of landmark pairs were selected following the recommendations of Toma [27]; the same landmark pairs were used for the calculation of the respective Euclidean distances. Figure 6 illustrates the paths used for further analysis in four facial regions: forehead/eyes, nose, upper lip, and lower lip/chin. For each face, eight paths were used for the forehead/eyes region, nine for the nose region, ten for the upper lip region, and six for the lower lip/chin region. These paths were selected following the gender classification results of Toma [27], who found that only 24 of the 250 Euclidean inter-landmark distances considered for the ALSPAC facial dataset provided gender discrimination efficiency of over 70%.

    Fig.5 Block diagram of the proposed gender analysis system using novel and traditional 3D geometric features.

    Fig. 6 Geodesic paths used in the algorithm. The curvature features were obtained for surface points on these paths. Each face region has a different number of geodesic paths: (a) forehead/eyes paths; (b) nose paths; (c) upper lip paths; (d) lower lip/chin paths.

    4.2.2 Curvature features

    Principal curvatures were first computed for all points of each path; the other features (mean curvature, Gaussian curvature, shape index, and curvedness) were then calculated from the principal curvatures, as explained above. Extraction of these features depends on the selected ring size, i.e., the size of the neighbourhood (number of mesh layers) around the vertex used to calculate the curvature tensor. The ring size also controls the local surface smoothing used for curvature calculations: it should be large if noise is present, but too much smoothing can mask surface details. In the present study, a ring size of 2 was used. For more details of the principal curvature calculation algorithm, see Ref. [17].

    4.2.3 Normalisation of curvature features

    Each geodesic path has a different number of nodes (vertices) at which curvatures are calculated. To cope with this, a normalised histogram distribution was calculated for each path feature; for this purpose, the number of bins selected was 5, 10, 15, 20, or 25, depending on the minimum number of nodes in a path across the entire sample.

    Let v1, ..., vn denote the vertices of path Pk on facial mesh k, and let mi, gi, ci, and si denote, respectively, the mean curvature, Gaussian curvature, curvedness value, and shape index value evaluated at vertex vi (i = 1, ..., n). For each path, we choose a number b = 5, 10, 15, 20, or 25 such that b ≤ min n, where min n is the minimum number of vertices over all paths Pk across the sample. After histogram normalisation with b bins, we get exactly 4b characteristic curvature features for path Pk:

    m̂k = (m̂k1, ..., m̂kb),  ĝk = (ĝk1, ..., ĝkb),  ĉk = (ĉk1, ..., ĉkb),  ŝk = (ŝk1, ..., ŝkb)

    where the hat denotes the respective values resulting from histogram normalisation. We then compose a feature descriptor Dk = ⟨m̂k, ĝk, ĉk, ŝk⟩ consisting of 4b components.

    In addition, we compose a feature descriptor for each region of face k by concatenating its path descriptors:

    Dk(region) = ⟨Dk,1, Dk,2, ..., Dk,P⟩, where P is the number of paths in that region
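A sketch of this descriptor construction (ours; the curvature values are made up, and `path_descriptor`/`region_descriptor` are illustrative names):

```python
import numpy as np

def path_descriptor(mean_c, gauss_c, curvedness, shape_idx, b):
    """4b-component descriptor for one geodesic path: each curvature
    feature is histogrammed into b bins and normalised to unit mass,
    making paths with different vertex counts comparable."""
    parts = []
    for values in (mean_c, gauss_c, curvedness, shape_idx):
        hist, _ = np.histogram(values, bins=b)
        parts.append(hist / hist.sum())
    return np.concatenate(parts)

def region_descriptor(path_descriptors):
    """Region descriptor: concatenation of the region's path descriptors."""
    return np.concatenate(path_descriptors)

# One path sampled at 20 vertices, with made-up curvature values:
n = 20
v = np.linspace(-1.0, 1.0, n)
D = path_descriptor(v, v ** 2, np.abs(v), v ** 3, b=5)
```

With b = 5 the path descriptor has 4 x 5 = 20 components, and each of its four histogram blocks sums to one regardless of how many vertices the path contained.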

    4.2.4 Calculation of geodesic and Euclidean distances

    For each face, 33 geodesic distances and 33 Euclidean distances were calculated between the same landmarks used to extract the geodesic paths shown in Fig. 6. The fast marching algorithm was used to compute the geodesic distances.

    4.3 Classification

    The LDA classifier was used to determine gender using a five-fold validation process, as suggested in Ref. [67] for large datasets. In this process, the 4745 facial meshes were first partitioned into five equally sized sets (folds); five iterations of training and validation were then performed such that, within each iteration, a different fold of the data was withheld for validation while the remaining four folds were used for training.
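A sketch of the fold construction (ours; the LDA training call itself is omitted):

```python
import numpy as np

def five_fold_splits(n, seed=0):
    """Partition n samples into five near-equal folds and yield
    (train_indices, test_indices) for each of the five iterations."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n)
    folds = np.array_split(idx, 5)
    for k in range(5):
        test = folds[k]
        train = np.concatenate([folds[j] for j in range(5) if j != k])
        yield train, test

# 4745 meshes divide exactly into five folds of 949:
splits = list(five_fold_splits(4745))
```

Each mesh appears in the validation fold exactly once across the five iterations, so every sample contributes to both training and testing.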

    5 Experiments

    Five computational experiments were designed for this study in order to determine an optimal set of features for gender classification; the influence of different parts of the face on gender classification accuracy within the ALSPAC dataset was also investigated. Experiments 1–3 determined which facial features were best suited to gender classification, and which facial regions (eyes, forehead, chin, lips, and nose) were most important for the task. Experiments 1 and 2 used the traditional Euclidean and geodesic distances as classification features, while Experiment 3 utilised our novel feature descriptors. Experiment 4 then examined the effect on the classification scores of combining the Euclidean distances, geodesic distances, and geodesic path curvature features. Finally, Experiment 5 sought the most discriminatory features for gender recognition in each facial region. The evaluation criterion for all experiments was the average gender classification accuracy under five-fold cross-validation; sensitivity and specificity measures were also determined. A detailed explanation of these experiments follows.

    5.1 Experiment 1:Euclidean distances

    We classified gender using 33 Euclidean distances extracted from the 21 biologically significant landmarks, as shown in Fig. 6. Our algorithm classified 79.4% of faces correctly as either male or female. Table 2 shows the gender recognition accuracies, as well as the sensitivities and specificities, for different facial parts. The Euclidean distances were calculated as in the previous study [27] on the ALSPAC dataset, which used them as features for gender recognition.

    5.2 Experiment 2:geodesic distances

    The second experiment used geodesic distances to determine facial gender; many previous studies [16, 68, 69] have suggested that geodesic distances may represent 3D models better than 3D Euclidean distances. Using the fast marching algorithm, 33 geodesic distances were calculated between a number of facial landmarks, as shown in Fig. 6. Gender classification results obtained with these features are shown in Table 3.

    Table 2 Gender classification results based on Euclidean distances

    5.3 Experiment 3:geodesic path curvature

    The previous two experiments utilised the Euclidean and geodesic distances as features for gender recognition, both of which are very common features for this task. In contrast, the third experiment used our novel feature descriptors based on the geodesic path curvatures. As explained in Section 4.2, our feature descriptors rely on selecting certain points of a geodesic path between landmarks (see Fig. 6), followed by determining the mean curvature, Gaussian curvature, shape index, and curvedness features for those points. We used histogram normalisation to resolve the problem of variations in face size. Before applying the resulting feature descriptors, it was important to analyse which histogram bin size would provide the best results; different bin sizes were tried to achieve the best classification accuracy. Figure 7 shows the relation between classification accuracy and bin size for each facial region.

    As is clear from Fig. 7, the optimal result was obtained using a bin size of 5 for both the nose and lower lip/chin regions, 10 for the forehead/eyes region, and 15 for the upper lip region. With these bin sizes, the overall gender recognition accuracy was 87.3%, much higher than that achieved in Experiments 1 and 2. Table 4 shows the accuracy, sensitivity, and specificity obtained for all facial regions using the geodesic path curvature feature descriptors.

    Table 3 Gender classification results based on geodesic distances

    Fig.7 Gender classification score for different histogram bin sizes.

    Table 4 Gender classification results based on geodesic path curvatures

    5.4 Experiment 4:combination of features

    With the results of the previous experiments, we were able to rank the features according to their classification accuracy. The best result was achieved when the geodesic path curvature features were used, while the Euclidean distances provided the poorest result. In the fourth experiment, the robustness of the gender recognition performance was explored using different combinations of the Euclidean distances, geodesic distances (after scaling using a min–max algorithm [70]), and geodesic path curvature features. The total 3D facial gender recognition rates are shown in Table 5. In general, combining different features improves the classification results. The best performance, 88.6% accuracy, was achieved when the geodesic distance and geodesic path curvature features were concatenated.
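A sketch of the min–max scaling and concatenation step (ours; the feature matrices are made up, one row per face):

```python
import numpy as np

def min_max_scale(X):
    """Column-wise min-max scaling to [0, 1], so that distance features
    and curvature histograms share a common range before concatenation."""
    X = np.asarray(X, dtype=float)
    lo, hi = X.min(axis=0), X.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)  # guard constant columns
    return (X - lo) / span

def combine_features(*feature_sets):
    """Concatenate scaled feature sets column-wise (one row per face)."""
    return np.hstack([min_max_scale(F) for F in feature_sets])

# Two made-up feature sets for three faces:
geodesic = np.array([[10., 100.], [20., 150.], [30., 200.]])
curv_hist = np.array([[0.1], [0.2], [0.4]])
combined = combine_features(geodesic, curv_hist)
```

Without this scaling, raw geodesic distances (tens of millimetres) would dominate the unit-mass histogram features in any distance-based classifier.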

    5.5 Experiment 5:discrimination capability of landmarks

    Our gender classification approach relies on biological landmarks as basic points from which to determine different classification features. Certain landmarks were used for each facial region to select Euclidean distances, geodesic distances, and geodesic path curvatures, following the recommendations of Toma [27]. The aim of this experiment was to determine the landmark pairs that define the best geodesic-curvature-based features (as described above) for gender discrimination. To achieve this aim, the paths connecting landmarks were ranked according to their individual classification accuracies obtained with the best combination of features (geodesic distance and geodesic path curvatures) from Experiment 4. Table 6 ranks inter-landmark paths for each face region, while Fig. 8 shows the three geodesic paths with the highest rank for each region.
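The ranking step can be approximated with a minimal two-class Fisher LDA. This is our own simplified stand-in (resubstitution accuracy, no cross-validation, invented path names), not the paper's implementation.

```python
import numpy as np

def lda_accuracy(X, y):
    # Two-class Fisher LDA, scored on the training data itself -
    # a simplified stand-in for the paper's LDA classifier.
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Pooled within-class scatter, lightly regularised for stability.
    Sw = (np.cov(X0, rowvar=False) * (len(X0) - 1)
          + np.cov(X1, rowvar=False) * (len(X1) - 1))
    w = np.linalg.solve(Sw + 1e-6 * np.eye(X.shape[1]), m1 - m0)
    z = X @ w
    thresh = 0.5 * ((X0 @ w).mean() + (X1 @ w).mean())
    pred = (z > thresh).astype(int)
    return float((pred == y).mean())

def rank_paths(path_features, y):
    # Rank inter-landmark paths: the highest individual accuracy
    # receives rank 1 (names and layout are ours, for illustration).
    scores = {name: lda_accuracy(X, y) for name, X in path_features.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Toy example: one well-separated path and one pure-noise path.
rng = np.random.default_rng(1)
labels = np.array([0] * 20 + [1] * 20)
good = np.vstack([rng.normal(0, 0.1, (20, 2)), rng.normal(3, 0.1, (20, 2))])
noise = rng.normal(0.0, 1.0, size=(40, 2))
ranking = rank_paths({"enR-enL": good, "random": noise}, labels)
```

A path whose features separate the two classes well floats to the top of the list; in the paper, the per-path features were the combined geodesic distance and geodesic path curvature descriptors.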

    Table 5 Results for different combinations of features for entire face

    Table 6 Landmark path rankings for gender discrimination provided by landmark pairs,using the LDA classifier and path feature descriptors.1 represents the highest rank,and so on

    Fig.8 Geodesic paths with the highest gender discrimination capability.

    6 General discussion

    The first three experiments in this study aimed to determine which facial features are most effective for gender classification. Using only 3D Euclidean distances (Experiment 1), we found the gender classification accuracy to be 79.4%, which is well below human perception accuracy but close to the results of Toma [27]. Experiment 2 demonstrated that geodesic distances between facial landmarks provide better gender identification; this is to be expected, since geodesic distance is a better measure of face shape than Euclidean distance. However, the classification accuracy determined using this measure was still below the human accuracy threshold.

    Our novel feature descriptors (geodesic path curvatures) were evaluated in Experiment 3 and produced a classification accuracy of 87.3%. The proposed geometric descriptor is an amalgamation of the mean curvature, Gaussian curvature, curvedness, and shape index at the vertices of the path, and thus represents a richer description of the surface than simple Euclidean or geodesic distance measures, explaining the marked improvement in classification accuracy.
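For reference, all four quantities follow from the principal curvatures k1 ≥ k2 at a vertex (shape index and curvedness as defined by Koenderink; sign conventions vary between authors, and the helper name below is ours, not the paper's).

```python
import numpy as np

def surface_descriptors(k1, k2):
    # Per-vertex quantities from principal curvatures, with k1 >= k2.
    # The shape index of a perfect plane (k1 == k2 == 0) is formally
    # undefined; np.arctan2(0, 0) conveniently returns 0 here.
    H = (k1 + k2) / 2.0                                # mean curvature
    K = k1 * k2                                        # Gaussian curvature
    C = np.sqrt((k1 ** 2 + k2 ** 2) / 2.0)             # curvedness
    S = (2.0 / np.pi) * np.arctan2(k1 + k2, k1 - k2)   # shape index in [-1, 1]
    return H, K, C, S

# Convex unit-sphere patch (k1 = k2 = 1): dome-like, shape index near +1.
H, K, C, S = surface_descriptors(1.0, 1.0)
```

A saddle point (k1 = 1, k2 = −1) gives zero mean curvature, negative Gaussian curvature, and a shape index of 0, which is why combining all four values distinguishes surface types that any single measure would conflate.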

    As shown in previous studies [15–17], using combinations of facial features can further improve classification accuracies. We explored various combinations of Euclidean or geodesic distances with our new geodesic path features. We achieved a further improvement in gender classification accuracy (88.6%) using a combination of geodesic distances between landmarks and our geodesic path features.

    This result compares favourably with other methods for gender classification. To the best of our knowledge, this is the best published result based on an anthropometric landmark approach, and was achieved for a credibly large sample of 4745 facial meshes. Several studies [5, 15, 71] used geodesic distances, Euclidean distances, or their combination as geometric gender classification features for both 2D and 3D facial images. The reported classification accuracies were generally higher than ours, but were achieved for much smaller sample sizes. Burton et al. [5] reported 96% accuracy for a sample of 179 faces, while Fellous [71] obtained 90% accuracy for 109 facial images. Gilani et al. [15] achieved 98.5% accuracy for 64 3D facial scans.

    Other studies have only utilised global facial features for gender classification. For example, Wu et al. [61] used raw shape-from-shading depth features to achieve a gender classification accuracy of 69.9% with the FRGCv1 dataset comprising 200 subject faces. Lu et al. [72] obtained a gender classification rate of 85.4% using the vertices of a generic facial mesh fitted to the raw 3D data as a classification feature descriptor for the same FRGCv1 dataset. Ballihi et al. [24] achieved a classification accuracy of 86% using a combination of radial and circular curves as classification features, and specified curves on the nose, forehead, and cheeks as a compact signature of a 3D face for face recognition and gender selection. However, it should be noted that the authors used a relatively small sample of 466 subject faces. It should also be noted that none of the above global methods is suitable for investigating specific relationships between individual facial regions and gender classification accuracy, which was the aim of this work. The present study operated on a large population cohort of 4745 fifteen-year-old Caucasian adolescents; thus, the gender recognition effectiveness identified in this study is likely to be more robust than that of other studies based on smaller samples.

    Physiological and psychological research [1, 5, 73] supports the idea that facial and gender recognition in the human brain is based more on individual regions than on the whole face. For example, Edelman et al. [74] compared human performance against a computer model in the classification of gender in 160 adult individuals (80 males, 80 females) using frontal facial images. The study revealed that humans were better than computers at discriminating females based on the upper face, whereas for males the human accuracy was better for the lower face. It was also highlighted that males have thicker eyebrows and larger noses and mouths than females. Several forensic and anthropometric studies have also shown that the female face, mouth, and nose are smaller than in males [14].

    Based on this information, the first three experiments conducted in the present study concentrated on using individual facial parts to determine gender recognition capability. Figure 9 shows an annotated view comparing the classification performance for 3D facial parts for each feature type (Euclidean and geodesic distances and geodesic path curvatures).

    Fig.9 Classification performance achieved using different types of 3D geometric features,showing the most discriminative facial regions based on geodesic path curvatures and on Euclidean and geodesic distances.

    As can be seen from Fig. 9, the nose is the most important facial area for gender discrimination in the ALSPAC dataset. In addition, the sensitivity and specificity of the results shown in Tables 2–4 identify the nasal morphological areas that are most effective for discriminating gender in young Caucasian people.

    This finding is in agreement with medical studies [75–77], which addressed changes in nasal shape and size in groups of 11–17-year-old subjects in relation to gender discrimination. These studies found that nasal height and nasal bridge length become fully mature at 15 years of age in males and 12 years of age in females.

    After establishing a set of strong gender-differentiating geometric features, we evaluated the discrimination capabilities of pairs of landmarks and their curvature features along the shortest geodesic path between them. We did this by finding the prime determinants of classification accuracy using the LDA classification method. Such landmarks can then form the basis of a more efficient, focused selection of specific manual landmarks, or even assist in developing a suitable directed automated landmark detection approach.

    Our results indicate that the landmarks that describe 3D facial profile curves are important in gender classification, as shown in Fig. 8. These findings validate other studies that have relied solely on 3D profile curves. For example, Lei et al. [78] extracted the central vertical profile and the nasal tip transverse profile, and located the facial feature points by analysing the curvatures of the profiles to obtain ten 3D geometric facial features, with a rank-1 accuracy of 98.8% using the ZJU-3DFED dataset and a rank-2 accuracy of 100% with the 3DFACE-XMU dataset. Also, Ter Haar and Veltkamp [79] performed 3D face matching and evaluation using profile and contour curves of the facial surface to achieve a mean average precision (MAP) of 0.70 and a 92.5% recognition rate (RR) on the 3D Face Shape Retrieval Contest dataset (SHREC'08), and an MAP of 0.96 and a 97.6% RR on the University of Notre Dame (UND) dataset.

    Our analysis revealed that the path between the inner canthi of the eyes (enR–enL), the ala shape path (alL–prn–alR), and the Cupid's bow path (cphL–ls–cphR) are the best characteristic paths for gender classification. These results have been corroborated by the results of previous studies conducted on the same ALSPAC dataset. Examples include Toma [27], who worked on whole 3D faces (PCA of a small set of anthropometric landmarks and Euclidean distance measures), and Wilson et al. [80], who worked only on the lower parts of 3D faces using manually identified regions. These studies identified approximately the same face regions, with less accurate classification.

    Finally, our analysis of the sensitivity and specificity results showed little difference between the above results for gender classification based on facial regions using the geodesic and Euclidean distances, as well as the geodesic path curvature features, with the exception of the nose trait. In general, our first four experiments yielded good specificity and sensitivity results, particularly Experiment 4, in which the geodesic distance and geodesic path curvature features were integrated; the sensitivity was 0.87 and the specificity was 0.9.
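As a reminder of how these two rates are computed for a binary gender label, here is a plain-Python sketch; which gender counts as "positive" is an arbitrary labelling choice.

```python
def sensitivity_specificity(y_true, y_pred, positive=1):
    # Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP).
    tp = fn = tn = fp = 0
    for t, p in zip(y_true, y_pred):
        if t == positive:
            tp += (p == positive)
            fn += (p != positive)
        else:
            tn += (p != positive)
            fp += (p == positive)
    return tp / (tp + fn), tn / (tn + fp)

# Toy example with 1 = male, 0 = female (the assignment is arbitrary):
# 3 of 4 males and 5 of 6 females are classified correctly.
truth = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
preds = [1, 1, 1, 0, 0, 0, 0, 0, 0, 1]
sens, spec = sensitivity_specificity(truth, preds)
```

Reporting both rates alongside accuracy reveals whether a classifier favours one gender over the other, which overall accuracy alone can hide.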

    7 Conclusions and future work

    This paper proposed a novel 3D geometric descriptor for effective gender analysis and discrimination. It utilises curvature features determined from geodesic paths between landmarks within a facial region. Five experiments were performed, exploring in detail aspects of facial traits based on key anthropometric landmarks. The results show that geodesic path curvature features extracted between 3D facial landmarks can classify the gender of Caucasian teenagers with an accuracy of 87.3%. Combining the new 3D geometric descriptor with classical distance measures resulted in the best classification accuracy of 88.6%. The hybrid geodesic path curvature and geodesic distance features demonstrated improved capability not only in terms of accuracy but also in terms of sensitivity and specificity. The sensitivity and specificity results show a noticeable variation between Caucasian teenagers in terms of both female and male nose morphology. Finally, Experiment 5 demonstrated that the geodesic paths between certain facial landmarks were more discriminative for gender classification and were more significant in 3D facial-profile contours. The nose ala path, Cupid's bow path, and the path between the inner canthi of the eyes were also shown to be significant.

    In future work, this study will be extended to explore gender variations depending only on profile contours. We will also evaluate the robustness of our novel feature descriptors on a dataset with moderate changes in facial expression and ethnicity.

    Acknowledgements

    We are extremely grateful to all of the families who took part in this study, the midwives for their help in recruiting them, and the whole ALSPAC team, which includes interviewers, computer and laboratory technicians, clerical workers, research scientists, volunteers, managers, receptionists, and nurses. The UK Medical Research Council, the Wellcome Trust (Grant ref: 102215/2/13/2), and the University of Bristol provided core support for ALSPAC. This publication is the work of the authors, and the first author, Hawraa Abbas, will serve as guarantor of the contents of this paper.

    References

    [1]Abdi,H.;Valentin,D.;Edelman,B.;O’Toole,A.J.More about the difference between men and women:Evidence from linear neural networks and the principal-component approach.PerceptionVol.24,No.5,539–562,1995.

    [2]Biederman,I.Recognition-by-components:A theory of human image understanding.Psychological ReviewVol.94,No.2,115–147,1987.

    [3]Brigham,J.C.;Barkowitz,P.Do“They all look alike?” The effect of race,sex,experience,and attitudes on the ability to recognize faces.Journal of Applied Social PsychologyVol.8,No.4,306–318,1978.

    [4]Bruce,V.;Burton,A.M.;Hanna,E.;Healey,P.;Mason,O.;Coombes,A.;Fright,R.;Linney,A.Sex discrimination:How do we tell the difference between male and female faces?PerceptionVol.22,No.2,131–152,1993.

    [5]Burton,A.M.;Bruce,V.;Dench,N.What’s the difference between men and women?Evidence from facial measurement.PerceptionVol.22,No.2,153–176,1993.

    [6]Malpass,R.S.;Kravitz,J.Recognition for faces of own and other race.Journal of Personality and Social PsychologyVol.13,No.4,330–334,1969.

    [7]O’Toole,A.J.;Peterson,J.;Deffenbacher,K.A.Another race effect for categorizing faces by sex.PerceptionVol.25,No.6,669–676,1996.

    [8]Golomb,B.A.;Lawrence,D.T.;Sejnowski,T.J.SEXNET:A neural network identifies sex from human faces.In: Proceedings of the Advances in Neural Information Processing Systems 3,Vol.1,2–8,1990.

    [9]Enlow,D.H.;Moyers,R.E.Handbook of Facial Growth.Philadelphia:WB Saunders Company,1982.

    [10]Shepherd,J.The face and social attribution.In:Handbook of Research on Face Processing.Young,A.W.;Ellis,H.D.Eds.Elsevier,289–320,1989.

    [11]Queirolo,C.C.;Silva,L.;Bellon,O.R.;Segundo,M.P.3D face recognition using simulated annealing and the surface interpenetration measure.IEEE Transactions on Pattern Analysis and Machine IntelligenceVol.32,No.2,206–219,2010.

    [12]O’Toole,A.J.;Vetter,T.;Troje,N.F.;Bülthoff,H.H.Sex classification is better with three-dimensional head structure than with image intensity information.PerceptionVol.26,No.1,75–84,1997.

    [13]Bruce,V.;Young,A.Understanding face recognition.British Journal of PsychologyVol.77,No.3,305–327,1986.

    [14]Farkas,L.G.Anthropometry of the Head and Face.Raven Press,1994.

    [15]Gilani,S.Z.;Rooney,K.;Shafait,F.;Walters,M.;Mian,A.Geometric facial gender scoring:Objectivity of perception.PLoS ONEVol.9,No.6,e99483,2014.

    [16]Gupta, S.; Markey, M.K.; Bovik, A.C.Anthropometric 3D face recognition.International Journal of Computer VisionVol.90,No.3,331–349,2010.

    [17]Abbas,H.;Hicks,Y.;Marshall,D.Automatic classification of facial morphology for medical applications.Procedia Computer ScienceVol.60,1649–1658,2015.

    [18]Lu,X.;Jain,A.K.Multimodal facial feature extraction for automatic 3D face recognition.Technical Report MSU-CSE-05-22.Michigan State University,2005.

    [19]Lu,X.;Jain,A.K.Automatic feature extraction for multiview 3D face recognition.In:Proceedings of the 7th International Conference on Automatic Face and Gesture Recognition,585–590,2006.

    [20]Lu,X.;Jain,A.K.;Colbry,D.Matching 2.5D face scans to 3D models.IEEE Transactions on Pattern Analysis and Machine IntelligenceVol.28,No.1,31–43,2006.

    [21]Perakis,P.;Theoharis,T.;Passalis,G.;Kakadiaris,I.A.Automatic 3D facial region retrieval from multi-pose facial datasets.In: Proceedings of the Eurographics Workshop on 3D Object Retrieval,37–44,2009.

    [22]Xu,C.;Wang,Y.;Tan,T.;Quan,L.Automatic 3D face recognition combining global geometric features with local shape variation information.In:Proceedings of the 6th IEEE International Conference on Automatic Face and Gesture Recognition,308–313,2004.

    [23]Zhao,J.;Liu,C.;Wu,Z.;Duan,F.;Zhang,M.;Wang,K.;Jia,T.3D facial similarity measure based on geodesic network and curvatures.Mathematical Problems in EngineeringVol.2014,Article ID 832837,2014.

    [24]Ballihi,L.;Amor,B.B.;Daoudi,M.;Srivastava,A.;Aboutajdine,D.Boosting 3-D geometric features for efficient face recognition and gender classification.IEEE Transactions on Information Forensics and SecurityVol.7,No.6,1766–1779,2012.

    [25]Li,X.;Jia,T.;Zhang,H.Expression-insensitive 3D face recognition using sparse representation.In:Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition,2575–2582,2009.

    [26]Zhang,L.;Chen,X.-Q.;Kong,X.-Y.;Huang,H.Geodesic video stabilization in transformation space.IEEE Transactions on Image ProcessingVol.26,No.5,2219–2229,2017.

    [27]Toma,A.M.Characterization of normal facial features and their association with genes.Ph.D.Thesis.Cardiff University,2014.

    [28]Shi,J.;Samal,A.;Marx,D.How effective are landmarks and their geometry for face recognition?Computer Vision and Image UnderstandingVol.102,No.2,117–133,2006.

    [29]Han,X.;Ugail,H.;Palmer,I.Gender classification based on 3D face geometry features using SVM.In:Proceedings of the International Conference on Cyber Worlds,114–118,2009.

    [30]Brown,E.;Perrett,D.I.What gives a face its gender?PerceptionVol.22,No.7,829–840,1993.

    [31]Henderson,J.;Sherriff,A.;Farrow,A.;Ayres,J.G.Household chemicals,persistent wheezing and lung function:Effect modification by atopy?European Respiratory JournalVol.31,No.3,547–554,2008.

    [32]Avon longitudinal study of parents and children.2017.Available at http://www.bristol.ac.uk/alspac/researchers data-access/data-dictionary.

    [33]Kau,C.H.;Richmond,S.Three-dimensional analysis of facial morphology surface changes in untreated children from 12 to 14 years of age.American Journal of Orthodontics and Dentofacial OrthopedicsVol.134,No.6,751–760,2008.

    [34]Fraser,A.;Macdonald-Wallis,C.;Tilling,K.;Boyd,A.;Golding,J.;Smith,G.D.;Henderson,J.;Macleod,J.;Molloy,L.;Ness,A.;Ring,S.;Nelson,S.M.;Lawlor,D.A.Cohort profile:The avon longitudinal study of parents and children:ALSPAC mothers cohort.International Journal of EpidemiologyVol.42,No.1,97–110,2013.

    [35]Toma,A.M.; Zhurov,A.; Playle,R.; Ong,E.; Richmond,S.Reproducibility of facial soft tissue landmarks on 3D laser-scanned facial images.Orthodontics&Craniofacial ResearchVol.12,No.1,33–42,2009.

    [36]Taubin,G.Estimating the tensor of curvature of a surface from a polyhedral approximation.In:Proceedings of the 5th International Conference on Computer Vision,902–907,1995.

    [37]Cohen-Steiner,D.;Morvan,J.-M.Restricted delaunay triangulations and normal cycle.In:Proceedings of the 19th Annual Symposium on Computational Geometry,312–321,2003.

    [38]Sun,X.;Morvan,J.-M.Curvature measures,normal cycles and asymptotic cones.Actes des rencontres du CIRMVol.3,No.1,3–10,2013.

    [39]Alliez,P.;Cohen-Steiner,D.;Devillers,O.;Lévy,B.;Desbrun,M.Anisotropic polygonal remeshing.ACM Transactions on GraphicsVol.22,No.3,485–493,2003.

    [40]Rusinkiewicz,S.Estimating curvatures and their derivatives on triangle meshes.In:Proceedings of the 2nd International Symposium on 3D Data Processing,Visualization and Transmission,486–493,2004.

    [41]Besl,P.J.;Jain,R.C.Invariant surface characteristics for 3D object recognition in range images.Computer Vision,Graphics,and Image ProcessingVol.33,No.1,33–80,1986.

    [42]Dorai,C.;Jain,A.K.COSMOS—A representation scheme for 3D free-form objects.IEEE Transactions on Pattern Analysis and Machine IntelligenceVol.19,No.10,1115–1130,1997.

    [43]Lappin,J.S.;Norman,J.F.;Phillips,F.Fechner,information, and shape perception.Attention,Perception,&PsychophysicsVol.73,No.8,2353–2378,2011.

    [44]Mpiperis,I.;Malassiotis,S.;Strintzis,M.G.3-D face recognition with the geodesic polar representation.IEEE Transactions on Information Forensics and SecurityVol.2,No.3,537–547,2007.

    [45]Surazhsky,V.;Surazhsky,T.;Kirsanov,D.;Gortler,S.J.;Hoppe,H.Fast exact and approximate geodesics on meshes.ACM Transactions on GraphicsVol.24,No.3,553–560,2005.

    [46]Chen,J.;Han,Y.Shortest paths on a polyhedron.In:Proceedings of the 6th Annual Symposium on Computational Geometry,360–369,1990.

    [47]Xin,S.-Q.;Wang,G.-J.Improving Chen and Han’s algorithm on the discrete geodesic problem.ACM Transactions on GraphicsVol.28,No.4,Article No.104,2009.

    [48]Crane,K.;Weischedel,C.;Wardetzky,M.Geodesics in heat:A new approach to computing distance based on heat flow.ACM Transactions on GraphicsVol.32,No.5,Article No.152,2013.

    [49]Liu,Y.-J.Exact geodesic metric in 2-manifold triangle meshes using edge-based data structures.Computer-Aided DesignVol.45,No.3,695–704,2013.

    [50]Xu,C.;Wang,T.Y.;Liu,Y.-J.;Liu,L.;He,Y.Fast wavefront propagation(FWP)for computing exact geodesic distances on meshes.IEEE Transactions on Visualization and Computer GraphicsVol.21,No.7,822–834,2015.

    [51]Ying,X.;Wang,X.;He,Y.Saddle vertex graph(SVG):A novel solution to the discrete geodesic problem.ACM Transactions on GraphicsVol.32,No.6,Article No.170,2013.

    [52]Ying,X.;Xin,S.-Q.;He,Y.Parallel Chen–Han(PCH)algorithm for discrete geodesics.ACM Transactions on GraphicsVol.33,No.1,Article No.9,2014.

    [53]Sethian,J.A.Level Set Methods and Fast Marching Methods:EvolvingInterfacesinComputational Geometry,Fluid Mechanics,Computer Vision,and Materials Science,Vol.3.Cambridge University Press,1999.

    [54]Zimmer,H.;Campen,M.;Kobbelt,L.Efficient computation of shortest path-concavity for 3D meshes.In:Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition,2155–2162,2013.

    [55]Cohen,L.D.;Kimmel,R.Global minimum for active contour models:A minimal path approach.International Journal of Computer VisionVol.24,No.1,57–78,1997.

    [56]Peyré,G.;Cohen,L.Geodesic computations for fast and accurate surface remeshing and parameterization.In:Elliptic and Parabolic Problems,157–171,2005.

    [57]Aldridge,K.;George,I.D.;Cole,K.K.;Austin,J.R.;Takahashi,T.N.;Duan,Y.;Miles,J.H.Facial phenotypes in subgroups of prepubertal boys with autism spectrum disorders are correlated with clinical phenotypes.Molecular AutismVol.2,No.1,15,2011.

    [58]Gilani,S.Z.; Mian,A.Perceptual differences between men and women:A 3D facial morphometric perspective.In:Proceedings of the 22nd International Conference on Pattern Recognition,2413–2418,2014.

    [59]Bekios-Calfa,J.;Buenaposada,J.M.;Baumela,L.Revisiting linear discriminant techniques in gender recognition.IEEE Transactions on Pattern Analysis and Machine IntelligenceVol.33,No.4,858–864,2011.

    [60]Meyer,D.;Wien,F.T.Support vector machines.R NewsVol.1,No.3,23–26,2001.

    [61]Wu,J.;Smith,W.A.;Hancock,E.R.Facial gender classification using shape-from-shading.Image and Vision ComputingVol.28,No.6,1039–1048,2010.

    [62]Zhu,W.;Zeng,N.;Wang,N.Sensitivity,specificity,accuracy,associated confidence interval and ROC analysis with practical SAS implementations.In:Proceedings of NESUG:Health Care and Life Sciences,1–9,2010.

    [63]Gower, J. C. Generalized procrustes analysis.PsychometrikaVol.40,No.1,33–51,1975.

    [64]Peyré,G.Toolbox graph—A toolbox to process graph and triangulated meshes.Available at http://uk.mathworks.com/matlabcentral/fileexchange/5355-toolbox-graph/content/toolbox graph/html/content.html.

    [65]Drira,H.;Amor,B.B.;Daoudi,M.;Srivastava,A.Pose and expression-invariant 3D face recognition using elastic radial curves.In: Proceedings of the British Machine Vision Conference,1–11,2010.

    [66]Peyré,G.Fast marching MATLAB toolbox.Available at http://uk.mathworks.com/matlabcentral/fileexchange/6110-toolbox-fast-marching.

    [67]Gehrig,T.;Steiner,M.;Ekenel,H.K.Draft:Evaluation guidelines for gender classification and age estimation.Technical Report.Karlsruhe Institute of Technology,2011.

    [68]Bronstein,A.M.;Bronstein,M.M.;Kimmel,R.Three-dimensional face recognition.International Journal of Computer VisionVol.64,No.1,5–30,2005.

    [69]Hamza,A.B.;Krim,H.Geodesic matching of triangulated surfaces.IEEE Transactions on Image ProcessingVol.15,No.8,2249–2258,2006.

    [70]Raschka,S.About feature scaling and normalization.2014.Available at https://sebastianraschka.com/Articles/2014-about-feature-scaling.html#aboutstandardization.

    [71]Fellous,J.-M.Gender discrimination and prediction on the basis of facial metric information.Vision ResearchVol.37,No.14,1961–1973,1997.

    [72]Lu,X.;Chen,H.;Jain,A.K.Multimodal facial gender and ethnicity identification.In:Proceedings of the International Conference on Biometrics,554–561,2006.

    [73]Palmer,S.E.Hierarchical structure in perceptual representation.Cognitive PsychologyVol.9,No.4,441–474,1977.

    [74]Edelman,B.;Valentin,D.;Abdi,H.Sex classification of face areas:How well can a linear neural network predict human performance?Journal of Biological SystemsVol.6,No.3,241–263,1998.

    [75]Abdelkader,M.;Leong,S.;White,P.S.Aesthetic proportions of the nasal aperture in 3 different racial groups of men.Archives of Facial Plastic SurgeryVol.7,No.2,111–113,2005.

    [76]Akgüner,M.;Barutçu,A.;Karaca,C.Adolescent growth patterns of the bony and cartilaginous framework of the nose:A cephalometric study.Annals of Plastic SurgeryVol.41,No.1,66–69,1998.

    [77]Prahl-Andersen,B.;Ligthelm-Bakker,A.;Wattel,E.;Nanda,R.Adolescent growth changes in soft tissue profile.American Journal of Orthodontics and Dentofacial OrthopedicsVol.107,No.5,476–483,1995.

    [78]Lei,Y.;Lai,H.;Li,Q.Geometric features of 3D face and recognition of it by PCA.Journal of MultimediaVol.6,No.2,207–216,2011.

    [79]Ter Haar,F.B.;Veltkamp,R.C.A 3D face matching framework for facial curves.Graphical ModelsVol.71,No.2,77–91,2009.

    [80]Wilson,C.;Playle,R.;Toma,A.;Zhurov,A.;Ness,A.;Richmond,S.The prevalence of lip vermilion morphological traits in a 15-year-old population.American Journal of Medical Genetics Part AVol.161,No.1,4–12,2013.
