Vague Entropy Measure for Complex Vague Soft Sets

Ganeshsree Selvachandran 1, Harish Garg 2,* and Shio Gai Quek 3

1 Department of Actuarial Science and Applied Statistics, Faculty of Business & Information Science, UCSI University, Jalan Menara Gading, Cheras 56000, Kuala Lumpur, Malaysia; [email protected]
2 School of Mathematics, Thapar Institute of Engineering & Technology (Deemed University), Patiala 147004, Punjab, India
3 A-Level Academy, UCSI College KL Campus, Lot 12734, Jalan Choo Lip Kung, Taman Taynton View, Cheras 56000, Kuala Lumpur, Malaysia; [email protected]
* Correspondence: [email protected]; Tel.: +91-86-9903-1147

Received: 12 March 2018; Accepted: 17 May 2018; Published: 24 May 2018


Abstract: The complex vague soft set (CVSS) model is a hybrid of complex fuzzy sets and soft sets that can accurately represent and model two-dimensional information for real-life phenomena that are periodic in nature. In existing studies of fuzzy sets and their extensions, the uncertainties present in the data are handled with a membership degree that is a subset of the real numbers. In the present work, this condition is relaxed to degrees whose range is a subset of the complex unit disc, which allows the information to be handled more faithfully. Under this environment, we develop some entropy measures for the CVSS model induced by the axiomatic definition of the distance measure. Some desirable relations between them are also investigated. A numerical example related to the detection of an image by a robot is given to illustrate the proposed entropy measures.

Keywords: vague entropy; distance induced vague entropy; distance; complex fuzzy set; complex vague soft set

1. Introduction

Classical information measures deal with information which is precise in nature, and information theory is one of the trusted ways to measure the degree of uncertainty in data. In our day-to-day life, uncertainty plays a dominant role in any decision-making process. In other words, as systems grow more complex day by day, decision makers may have to give their judgments in an imprecise, vague and uncertain environment. To deal with such information, Zadeh [1] introduced the theory of fuzzy sets (FSs) for handling the uncertainties in the data by defining a membership function with values between 0 and 1. In this environment, Deluca and Termini [2] proposed a set of axioms for fuzzy entropy. Liu [3] and Fan and Xie [4] both studied information measures related to entropy, distance, and similarity for fuzzy sets. With growing complexities, researchers have developed extensions such as the intuitionistic fuzzy set (IFS) [5], vague set (VS) [6], and interval-valued IFS [7] to deal with the uncertainties. Under these extensions, Szmidt and Kacprzyk [8] extended the axioms of Deluca and Termini [2] to the IFS environment. Later on, corresponding to Deluca and Termini's [2] fuzzy entropy measure, Vlachos and Sergiadis [9] extended their measure to the IFS environment. Burillo and Bustince [10] introduced the entropy of intuitionistic fuzzy sets (IFSs) as a tool to measure the degree of intuitionism associated with an IFS. Garg et al. [11] presented a generalized intuitionistic fuzzy entropy measure of order α and degree β to solve decision-making problems. In addition to the mentioned examples, other authors have also addressed the problem of decision-making by using different information measures [12–25].

Entropy 2018, 20, 403; doi:10.3390/e20060403

www.mdpi.com/journal/entropy


All the above-defined work has been successfully applied in various disciplines without considering the parameterization factor during the analysis. Therefore, in certain cases, these existing theories may be unable to classify the object. To cope with such situations, many researchers have paid increasing attention to soft set (SS) theory [26]. Since its introduction, researchers have been engaged in its extensions. For instance, Maji et al. [27,28] combined the theory of SSs with FSs and IFSs and came up with the new concepts of the fuzzy soft set (FSS) and the intuitionistic fuzzy soft set (IFSS). Further, hybridizations of SSs with other models, such as the generalized fuzzy soft set [29,30], the generalized intuitionistic fuzzy soft set [31,32], distance measures [33–36], and fuzzy number intuitionistic fuzzy soft sets [37], play a dominant role in the decision-making process. The IFSS handles the uncertainties in the data by incorporating the idea of the expert as well as the parametric factors. In that environment, Arora and Garg [38,39] presented some aggregation operators for intuitionistic fuzzy soft numbers. Garg and Arora [40] presented a non-linear methodology for solving decision-making problems in an IFSS environment. In terms of information measures, Garg and Arora [34] developed various distance and similarity measures for dual hesitant FSSs. Recently, Garg and Arora [41] presented Bonferroni mean aggregation operators for an IFSS environment. Apart from these, the vague soft set [42] is an alternative theory which is the hybridization of the vague set [6] and the soft set [26]. In this field, Chen [43] developed some similarity measures for vague sets. Wang and Qu [44] developed some entropy, similarity and distance measures for vague sets. Selvachandran et al. [45] introduced distance-induced entropy measures for generalized intuitionistic fuzzy soft sets.
The above theories based on FSs, IFSs, IFSSs, VSs and FSSs are widely employed by researchers, but they can handle only the uncertainty in the data. None of these models can capture the fluctuations of the data at a given phase of time during their execution; yet in today's world the uncertainty and vagueness of data change periodically with the passage of time, and hence the existing theories are unable to accommodate this information. To overcome this deficiency, Ramot et al. [46] presented the complex fuzzy set (CFS), in which the range of the membership function is extended from the real numbers to the complex numbers within the unit disc. Ramot et al. [47] generalized traditional fuzzy logic to complex fuzzy logic, in which the sets used in the reasoning process are complex fuzzy sets characterized by complex-valued membership functions. Later on, Greenfield et al. [48] extended the concept of the CFS by taking the grade of the membership function as an interval number rather than a single number. Yazdanbakhsh and Dick [49] conducted a systematic review of CFSs and complex fuzzy logic and discussed their applications. Later on, Alkouri and Salleh [50] extended the concept of the CFS to complex intuitionistic fuzzy sets (CIFSs) by adding a complex-valued non-membership degree and studied their basic operations. Alkouri and Salleh [51] introduced the concepts of CIF relations, compositions and projections, and proposed a distance measure between two CIFSs. Rani and Garg [52] presented a series of distance measures for the CIFS environment. Kumar and Bajaj [53] proposed some distance and entropy measures for CIF soft sets. In these theories, two-dimensional information (amplitude and phase terms) is represented in a single set. The function of the phase term is to model the periodicity and/or seasonality of the elements.
For instance, when dealing with an economics-related situation, the phase term represents the time taken for the change in an economic variable to impact the economy. In robotics, on the other hand, the phase term can represent direction, whereas in image processing, the phase term can represent the non-physical attributes of the image. As an alternative to these theories, the concept of the complex vague soft set (CVSS) [54] handles two-dimensional information by combining the properties of CFSs [46], soft sets [26] and vague sets [6]. CVSSs differ from the existing sets in the following features: (1) an interval-based membership structure that provides users with the means of recording their hesitancy in the process of assigning membership values to the elements; (2) the ability to handle partial ignorance of the data; (3) adequate parameterization abilities that allow for a more comprehensive representation of the parameters. Selvachandran et al. [54,55] investigated complex vague soft sets


(CVSSs). Selvachandran et al. [56] presented similarity measures for CVSSs and their applications to pattern recognition problems. Motivated by the concept of the CVSS, the focus of this work is to explore the structural characteristics of CVSSs and to present some information measures for handling the uncertainties in the data. To the best of our knowledge, the information measures in the aforementioned studies cannot be utilized to handle CVSS information. In order to achieve this, we develop axiomatic definitions of distance and entropy measures between CVSSs and hence propose some new entropy measures. Some of the algebraic properties of these measures and the relations between them are also verified. The proposed measures have the following characteristics: (1) they serve as a complement to the CVSS model and its relations in representing and modeling time-periodic phenomena; (2) they have elegant properties that increase their reach and applicability; (3) they have important applications in many real-world problems in the areas of image detection, pattern recognition and image processing; (4) they add to the existing collection of methodologies and techniques in artificial intelligence and soft computing, where it is often necessary to determine the degree of vagueness of the data in order to make optimal decisions. This supports the increasingly widespread trend of using mathematical tools to complement scientific theories and existing procedures in the handling and solving of real-life problems that involve vague, unreliable and uncertain two-dimensional information. Furthermore, an effort has been made to solve the classification problem in multi-dimensional complex data sets. To elaborate the proposed method, we focus on the representation and recognition of digital images defined by multi-dimensional complex data sets, using the properties of CVSSs and a new distance and entropy measure for this model.
The rest of the manuscript is organized as follows: in Section 2, we briefly review the basic concepts of SSs and CVSSs. In Section 3, we give the axiomatic definitions of the distance and entropy measures for CVSSs. In Section 4, some basic relationships between the distance and entropy measures are established. In Section 5, the utility of the CVSS model and its entropy measures is illustrated by applying them to the classification of digital images with multi-dimensional data. Finally, conclusions and future work are stated in Section 6.

2. Preliminaries

In this section, we briefly review some basic concepts related to VSs, SSs and CVSSs defined over the universal set U.

Definition 1 [6]. A vague set (VS) V in U is characterized by truth and falsity membership functions t_V, f_V : U → [0, 1] with t_V(x) + f_V(x) ≤ 1 for any x ∈ U. The values assigned to t_V(x) and f_V(x) are real numbers in [0, 1]. The grade of membership of x is located in the interval [t_V(x), 1 − f_V(x)], and the uncertainty of x is defined as (1 − f_V(x)) − t_V(x).

It is clearly seen from this definition that VSs are a generalization of fuzzy sets. If we set 1 − f_V(x) equal to 1 − t_V(x), then the VS reduces to an FS. If we set 1 − t_V(x) to be ν_A(x) (called the non-membership degree), then the VS reduces to an IFS. On the other hand, if we set t_V(x) = μ_V^L(x) and 1 − f_V(x) = μ_V^U(x), then the VS reduces to an interval-valued FS. Thus, we conclude that VSs are a generalization of FSs, IFSs and interval-valued FSs.

Definition 2 [6]. Let A = {< x, [t_A(x), 1 − f_A(x)] > : x ∈ U} and B = {< x, [t_B(x), 1 − f_B(x)] > : x ∈ U} be two VSs defined on U. The basic operational laws between them are defined as follows:

(i) Containment: A ⊆ B if t_A(x) ≤ t_B(x) and 1 − f_A(x) ≤ 1 − f_B(x) for all x.
(ii) Complement: A^c = {< x, [f_A(x), 1 − t_A(x)] > : x ∈ U}.
(iii) Union: A ∪ B = {< x, [max(t_A(x), t_B(x)), max(1 − f_A(x), 1 − f_B(x))] > : x ∈ U}.
(iv) Intersection: A ∩ B = {< x, [min(t_A(x), t_B(x)), min(1 − f_A(x), 1 − f_B(x))] > : x ∈ U}.
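A minimal sketch of these operational laws, representing each element's grade as a pair (t, 1 − f); the dictionary representation and variable names are illustrative choices, not part of the paper:

```python
# Each vague-set element x is stored as (t, u) with u = 1 - f and t + f <= 1.

def vs_complement(A):
    # <x, [f(x), 1 - t(x)]>: f = 1 - u, so the pair becomes (1 - u, 1 - t).
    return {x: (1 - u, 1 - t) for x, (t, u) in A.items()}

def vs_union(A, B):
    # Element-wise maximum of both interval endpoints.
    return {x: (max(A[x][0], B[x][0]), max(A[x][1], B[x][1])) for x in A}

def vs_intersection(A, B):
    # Element-wise minimum of both interval endpoints.
    return {x: (min(A[x][0], B[x][0]), min(A[x][1], B[x][1])) for x in A}

A = {"x1": (0.3, 0.6), "x2": (0.5, 0.7)}   # e.g. x1 has t = 0.3, f = 0.4
B = {"x1": (0.4, 0.5), "x2": (0.2, 0.9)}
print(vs_union(A, B))   # {'x1': (0.4, 0.6), 'x2': (0.5, 0.9)}
```

Note that the complement simply swaps the roles of truth and falsity, so applying it twice returns the original set.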


Definition 3 [26]. Let P(U) denote the power set of U. A pair (F, A) is called a soft set (SS) over U, where F is a mapping given by F : A → P(U).

Definition 4 [42]. Let V(U) be the power set of VSs over U. A pair (F̂, A) is called a vague soft set (VSS) over U, where F̂ is a mapping given by F̂ : A → V(U). Mathematically, a VSS can be written as:

(F̂, A) = { < x, [t_{F̂(e)}(x), 1 − f_{F̂(e)}(x)] > : x ∈ U, e ∈ A }

It is clearly seen that this set is the hybridization of SSs and VSs.

Definition 5 [57]. A complex vague set (CVS) is defined as an ordered pair A = { < x, [t_A(x), 1 − f_A(x)] > : x ∈ U }, where t_A : U → {a : a ∈ C, |a| ≤ 1} and f_A : U → {a : a ∈ C, |a| ≤ 1} are the truth and falsity membership functions with values in the complex unit disc, defined as t_A(x) = rt_A(x) · e^{i wrt_A(x)} and 1 − f_A(x) = (1 − kf_A(x)) · e^{i(2π − wkf_A(x))}, where i = √(−1).

Definition 6 [54]. Let P(U) denote the complex vague power set of U and E be the set of parameters. For any A ⊆ E, a pair (F, A) is called a complex vague soft set (CVSS) over U, where F : A → P(U) is defined, for each a ∈ A, as:

F(a) = { < x_j, [rt_{F_a}(x_j), 1 − kf_{F_a}(x_j)] · e^{i [wrt_{F_a}(x_j), 2π − wkf_{F_a}(x_j)]} > : x_j ∈ U }

where j = 1, 2, 3, ... indexes the elements of U, the amplitude terms rt_{F_a}(x) and 1 − kf_{F_a}(x) are real-valued in [0, 1], the phase terms wrt_{F_a}(x) and 2π − wkf_{F_a}(x) are real-valued in the interval (0, 2π], 0 ≤ rt_{F_a}(x) + kf_{F_a}(x) ≤ 1, and i = √(−1). The major advantages of the CVSS are that it represents two-dimensional information in a single set, and each object is characterized in terms of its magnitude as well as its phase term. Further, the soft set component of the CVSS provides an adequate parameterization tool to represent the information.

Definition 7 [54]. For two CVSSs (F, A) and (G, B) over U, the basic operations between them are defined as follows:

(i) (F, A) ⊂ (G, B) if and only if the following conditions are satisfied for all x ∈ U:
    (a) rt_{F_a}(x) ≤ rt_{G_b}(x) and kf_{G_b}(x) ≤ kf_{F_a}(x);
    (b) wrt_{F_a}(x) ≤ wrt_{G_b}(x) and wkf_{G_b}(x) ≤ wkf_{F_a}(x).
(ii) Null CVSS: (F, A) = φ if rt_{F_a}(x) = 0, kf_{F_a}(x) = 1 and wrt_{F_a}(x) = 0π, wkf_{F_a}(x) = 2π for all x ∈ U.
(iii) Absolute CVSS: (F, A) = U if rt_{F_a}(x) = 1, kf_{F_a}(x) = 0 and wrt_{F_a}(x) = 2π, wkf_{F_a}(x) = 0π for all x ∈ U.

3. Axiomatic Definition of Distance Measure and Vague Entropy

Let E be a set of parameters and U be the universe of discourse. In this section, we present some information measures, namely distance and entropy, for the collection of CVSSs over U, denoted by CVSS(U).

Definition 8. Let (F, A), (G, B), (H, C) ∈ CVSS(U). A complex-valued function d : CVSS(U) × CVSS(U) → {a : a ∈ C, |a| ≤ 1} is called a distance measure between CVSSs if it satisfies the following axioms:

(D1) d((F, A), (G, B)) = d((G, B), (F, A))


(D2) d((F, A), (G, B)) = 0 ⟺ (F, A) = (G, B)

(D3) d((F, A), (G, B)) = 1 ⟺ for all e ∈ E and x ∈ U, both (F, A) and (G, B) are crisp sets in U, i.e.,
    (F, A) = { x, [0, 0]e^{i[0π, 0π]} } and (G, B) = { x, [1, 1]e^{i[2π, 2π]} },
    or (F, A) = { x, [0, 0]e^{i[2π, 2π]} } and (G, B) = { x, [1, 1]e^{i[0π, 0π]} },
    or (F, A) = { x, [1, 1]e^{i[2π, 2π]} } and (G, B) = { x, [0, 0]e^{i[0π, 0π]} },
    or (F, A) = { x, [1, 1]e^{i[0π, 0π]} } and (G, B) = { x, [0, 0]e^{i[2π, 2π]} }.

(D4) If (F, A) ⊆ (G, B) ⊆ (H, C), then d((F, A), (H, C)) ≥ max(d((F, A), (G, B)), d((G, B), (H, C))).

Next, we give the axiomatic definition of vague entropy for a CVSS.

Definition 9. A complex-valued function M : CVSS(U) → {a : a ∈ C, |a| ≤ 1} is called a vague entropy of CVSSs if it satisfies the following axioms for any (F, A), (G, B) ∈ CVSS(U).

(M1) 0 ≤ |M(F, A)| ≤ 1.

(M2) M(F, A) = 0 ⟺ (F, A) is a crisp set on U for all a ∈ A and x ∈ U, i.e., rt_{F_a}(x) = 1, kf_{F_a}(x) = 0 and wrt_{F_a}(x) = 2π, wkf_{F_a}(x) = 0π; or rt_{F_a}(x) = 1, kf_{F_a}(x) = 0 and wrt_{F_a}(x) = 0π, wkf_{F_a}(x) = 2π; or rt_{F_a}(x) = 0, kf_{F_a}(x) = 1 and wrt_{F_a}(x) = 0π, wkf_{F_a}(x) = 2π; or rt_{F_a}(x) = 0, kf_{F_a}(x) = 1 and wrt_{F_a}(x) = 2π, wkf_{F_a}(x) = 0π.

(M3) M(F, A) = 1 ⟺ for all a ∈ A and x ∈ U, (F, A) is completely vague, i.e., rt_{F_a}(x) = kf_{F_a}(x) and wrt_{F_a}(x) = wkf_{F_a}(x).

(M4) M(F, A) = M((F, A)^c).

(M5) If one of the following two cases holds for all a ∈ A and x ∈ U:

Case 1: rt_{F_a}(x) ≤ rt_{G_b}(x), kf_{F_a}(x) ≥ kf_{G_b}(x) whenever rt_{G_b}(x) ≤ kf_{G_b}(x); and wrt_{F_a}(x) ≤ wrt_{G_b}(x), wkf_{F_a}(x) ≥ wkf_{G_b}(x) whenever wrt_{G_b}(x) ≤ wkf_{G_b}(x);

Case 2: rt_{F_a}(x) ≥ rt_{G_b}(x), kf_{F_a}(x) ≤ kf_{G_b}(x) whenever rt_{G_b}(x) ≥ kf_{G_b}(x); and wrt_{F_a}(x) ≥ wrt_{G_b}(x), wkf_{F_a}(x) ≤ wkf_{G_b}(x) whenever wrt_{G_b}(x) ≥ wkf_{G_b}(x);

then M(F, A) ≤ M(G, B).

Based on this definition, it is clear that a value close to 0 indicates that the CVSS has a very low degree of vagueness, whereas a value close to 1 implies that the CVSS is highly vague. For all x ∈ U, the nearer rt_{F_a}(x) is to kf_{F_a}(x), the larger the vague entropy measure, and it reaches a maximum when rt_{F_a}(x) = kf_{F_a}(x). Condition (M5), on the other hand, is slightly different, as it is constructed using the sharpened version of a vague soft set as explained in Hu et al. [58], instead of the usual condition that (F, A) ⊂ (G, B) implies that the entropy of (F, A) is higher than the entropy of (G, B). In [58], Hu et al. proved that this condition is inaccurate and provided several counter-examples to disprove it. Subsequently, they replaced this flawed condition with two new cases. We generalized these two cases to derive condition (M5) in this paper, in a bid to increase the accuracy of our proposed vague entropy. We refer the readers to [58] for further information on these revised conditions.

4. Relations between the Proposed Distance Measure and Vague Entropy

In the following, let U be the absolute CVSS and φ be the null CVSS. Then, based on the above definitions, we establish some relationships between the distance measure and vague entropy as follows:


Theorem 1. Let (F, A) be a CVSS and let d be a distance measure between CVSSs. Then the measures M1, M2 and M3 defined by

(i) M1(F, A) = 1 − d((F, A), (F, A)^c)
(ii) M2(F, A) = d((F, A) ∪ (F, A)^c, U)
(iii) M3(F, A) = 1 − d((F, A) ∪ (F, A)^c, (F, A) ∩ (F, A)^c)

are valid vague entropies of CVSSs.

Proof. Here, we prove only part (i); the others can be proved similarly. It is clearly seen from the definition of vague entropy that M1 satisfies conditions (M1) to (M4), so we need to prove only (M5). For this, consider the two cases stated in Definition 9. We prove that condition (M5) is satisfied for Case 1; the proof for Case 2 is similar and is thus omitted. From the conditions given in Case 1 of (M5), we obtain the following relationships:

rt_{F_a}(x) ≤ rt_{G_b}(x) ≤ kf_{G_b}(x) ≤ kf_{F_a}(x)

and wrt_{F_a}(x) ≤ wrt_{G_b}(x) ≤ wkf_{G_b}(x) ≤ wkf_{F_a}(x).

Therefore, we have φ ⊂ (F, A) ⊂ (G, B) ⊂ (G, B)^c ⊂ (F, A)^c ⊂ U. Hence, it follows that:

d((F, A), (F, A)^c) ≥ d((G, B), (G, B)^c).

Now, by the definition of M1, we have:

M1(F, A) = 1 − d((F, A), (F, A)^c) ≤ 1 − d((G, B), (G, B)^c) = M1(G, B).

This completes the proof.

Theorem 2. If d is a distance measure between CVSSs, then:

M4(F, A) = d((F, A) ∪ (F, A)^c, U) / d((F, A) ∩ (F, A)^c, U)

is a vague entropy of CVSSs.

Proof. For two CVSSs (F, A) and (G, B), it is clearly seen that M4 satisfies conditions (M1)–(M4), so it is enough to prove that M4 satisfies condition (M5). Consider the case:

rt_{F_a}(x) ≤ rt_{G_b}(x), kf_{F_a}(x) ≥ kf_{G_b}(x) whenever rt_{G_b}(x) ≤ kf_{G_b}(x)

and wrt_{F_a}(x) ≤ wrt_{G_b}(x), wkf_{F_a}(x) ≥ wkf_{G_b}(x) whenever wrt_{G_b}(x) ≤ wkf_{G_b}(x),

which implies that:

rt_{F_a}(x) ≤ rt_{G_b}(x) ≤ kf_{G_b}(x) ≤ kf_{F_a}(x)

and wrt_{F_a}(x) ≤ wrt_{G_b}(x) ≤ wkf_{G_b}(x) ≤ wkf_{F_a}(x).

Thus, we obtain:

φ ⊂ (F, A) ∩ (F, A)^c ⊂ (G, B) ∩ (G, B)^c ⊂ (G, B) ∪ (G, B)^c ⊂ (F, A) ∪ (F, A)^c ⊂ U.

Therefore, we have d((F, A) ∪ (F, A)^c, U) ≤ d((G, B) ∪ (G, B)^c, U) and d((F, A) ∩ (F, A)^c, U) ≥ d((G, B) ∩ (G, B)^c, U). Hence, by the definition of M4, we have:

M4(F, A) = d((F, A) ∪ (F, A)^c, U) / d((F, A) ∩ (F, A)^c, U) ≤ d((G, B) ∪ (G, B)^c, U) / d((G, B) ∩ (G, B)^c, U) = M4(G, B).

Similarly, for the other case, i.e., when rt_{F_a}(x) ≥ rt_{G_b}(x), kf_{F_a}(x) ≤ kf_{G_b}(x) whenever rt_{G_b}(x) ≥ kf_{G_b}(x), and wrt_{F_a}(x) ≥ wrt_{G_b}(x), wkf_{F_a}(x) ≤ wkf_{G_b}(x) whenever wrt_{G_b}(x) ≥ wkf_{G_b}(x), we obtain M4(F, A) ≤ M4(G, B). Hence (M5) is satisfied, and therefore M4 is a valid entropy measure.

Theorem 3. For a CVSS (F, A), if d is a distance measure between CVSSs, then:

M5(F, A) = d((F, A) ∩ (F, A)^c, φ) / d((F, A) ∪ (F, A)^c, φ)

is a vague entropy of CVSSs.

Proof. The proof is similar to that of Theorem 2, so we omit it here.

Theorem 4. For two CVSSs (F, A) and (G, B), if d is a distance measure between CVSSs such that d((F, A), (G, B)) = d((F, A)^c, (G, B)^c), then the entropies M4 and M5 satisfy M4 = M5.

Proof. By the definitions of M4 and M5, and noting that ((F, A) ∪ (F, A)^c)^c = (F, A) ∩ (F, A)^c, ((F, A) ∩ (F, A)^c)^c = (F, A) ∪ (F, A)^c and U^c = φ, we have:

M4(F, A) = d((F, A) ∪ (F, A)^c, U) / d((F, A) ∩ (F, A)^c, U)
         = d(((F, A) ∪ (F, A)^c)^c, U^c) / d(((F, A) ∩ (F, A)^c)^c, U^c)
         = d((F, A) ∩ (F, A)^c, φ) / d((F, A) ∪ (F, A)^c, φ)
         = M5(F, A).


Theorem 5. For a CVSS (F, A), if d is a distance measure between CVSSs that satisfies d((F, A), U) = d((F, A), φ), then:

M6(F, A) = d((F, A) ∪ (F, A)^c, U) / d((F, A) ∪ (F, A)^c, φ)

is a vague entropy of CVSSs.

Theorem 6. If d is a distance measure between CVSSs that satisfies d((F, A), U) = d((F, A), φ), then:

M7(F, A) = d((F, A) ∩ (F, A)^c, φ) / d((F, A) ∩ (F, A)^c, U)

is a vague entropy of CVSSs.

Theorem 7. If d is a distance measure between CVSSs that satisfies d((F, A), (G, B)) = d((F, A)^c, (G, B)^c), then M6 = M7.

Proof. The proofs of Theorems 5–7 can be obtained similarly to the above, so we omit them here.

Theorem 8. If d is a distance measure between CVSSs, then:

M8(F, A) = 1 − d((F, A) ∩ (F, A)^c, U) + d((F, A) ∪ (F, A)^c, U)

is a vague entropy of CVSSs.

Theorem 9. If d is a distance measure between CVSSs, then:

M9(F, A) = 1 − d((F, A) ∪ (F, A)^c, φ) + d((F, A) ∩ (F, A)^c, φ)

is a vague entropy of CVSSs.

Theorem 10. If d is a distance measure between CVSSs (F, A) and (G, B) such that d((F, A), (G, B)) = d((F, A)^c, (G, B)^c), then M8 = M9.

Theorem 11. If d is a distance measure between CVSSs, then:

M10(F, A) = 1 − d((F, A) ∪ (F, A)^c, φ) + d((F, A) ∪ (F, A)^c, U)

is a vague entropy of CVSSs.

Theorem 12. If d is a distance measure between CVSSs, then:

M11(F, A) = 1 − d((F, A) ∩ (F, A)^c, U) + d((F, A) ∩ (F, A)^c, φ)

is a vague entropy of CVSSs.


Theorem 13. If d is a distance measure between CVSSs (F, A) and (G, B) such that d((F, A), (G, B)) = d((F, A)^c, (G, B)^c), then M10 = M11.

Proof. The proofs of these theorems can be obtained similarly to the above, so we omit them here.

5. Illustrative Example

In this section, we present a scenario which necessitates the use of CVSSs. Subsequently, we present an application of the entropy measures proposed in Section 4 to an image detection problem, to illustrate the validity and effectiveness of our proposed entropy formulas. Firstly, we define the distance between any two CVSSs as follows:

Definition 10. Let (F, A) and (G, B) be two CVSSs over U, with parameters a_i ∈ A, b_i ∈ B (i = 1, 2, ..., m) and elements x_j ∈ U (j = 1, 2, ..., n). The distance between (F, A) and (G, B) is given by:

d((F, A), (G, B)) = (1 / 4mn) Σ_{i=1}^{m} Σ_{j=1}^{n} [ max( |rt_{F_{a_i}}(x_j) − rt_{G_{b_i}}(x_j)|, |kf_{G_{b_i}}(x_j) − kf_{F_{a_i}}(x_j)| ) + (1 / 2π) max( |wrt_{F_{a_i}}(x_j) − wrt_{G_{b_i}}(x_j)|, |wkf_{F_{a_i}}(x_j) − wkf_{G_{b_i}}(x_j)| ) ]
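Assuming each CVSS is flattened into a list with one entry per (parameter, element) pair, encoded as a tuple (rt, kf, wrt, wkf) with the phase terms already normalised by 2π (so the 1/(2π) factor disappears), Definition 10 can be sketched as follows; the function name and encoding are illustrative choices, not from the paper:

```python
def cvss_distance(F, G):
    """Distance of Definition 10 between two CVSSs given as equal-length
    lists of entries (rt, kf, wrt, wkf), one entry per (parameter, element)
    pair, so len(F) = m * n; phases are stored normalised by 2*pi."""
    assert len(F) == len(G)
    total = 0.0
    for (a1, b1, c1, d1), (a2, b2, c2, d2) in zip(F, G):
        total += max(abs(a1 - a2), abs(b1 - b2))   # amplitude part
        total += max(abs(c1 - c2), abs(d1 - d2))   # phase part (already / 2*pi)
    return total / (4 * len(F))

# Axioms (D1) and (D2) on a toy pair of single-entry CVSSs:
F = [(0.2, 0.2, 0.1, 0.8)]
G = [(0.3, 0.5, 0.2, 0.6)]
print(round(cvss_distance(F, G), 4), cvss_distance(F, F))   # 0.125 0.0
```

Symmetry (D1) follows directly from the use of absolute differences, and d(F, F) = 0 gives one direction of (D2).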

In order to demonstrate the utility of the above-proposed entropy measures Mi (i = 1, 2, ..., 11), we illustrate them with a numerical example. Consider a CVSS (F, A) defined over the parameters e1, e2 ∈ E and the elements x1, x2, x3 ∈ U as follows (rows correspond to x1, x2, x3 and columns to e1, e2):

(F, A):
x1: [0.2, 0.8]e^{i[0.1(2π), 0.2(2π)]}   [0.5, 0.8]e^{i[0.1(2π), 0.2(2π)]}
x2: [0.3, 0.5]e^{i[0.2(2π), 0.4(2π)]}   [0.2, 0.3]e^{i[0.2(2π), 0.4(2π)]}
x3: [0.3, 0.6]e^{i[0.4(2π), 0.5(2π)]}   [0.7, 0.9]e^{i[0.4(2π), 0.5(2π)]}

and hence the complement of this CVSS is:

(F, A)^c:
x1: [0.2, 0.8]e^{i[0.8(2π), 0.9(2π)]}   [0.2, 0.5]e^{i[0.8(2π), 0.9(2π)]}
x2: [0.5, 0.7]e^{i[0.6(2π), 0.8(2π)]}   [0.7, 0.8]e^{i[0.6(2π), 0.8(2π)]}
x3: [0.4, 0.7]e^{i[0.5(2π), 0.6(2π)]}   [0.1, 0.3]e^{i[0.5(2π), 0.6(2π)]}

Then, using the distance measure of Definition 10, we get d((F, A), (F, A)^c) = 0.1708, d((F, A) ∪ (F, A)^c, U) = 0.2167, d((F, A) ∪ (F, A)^c, (F, A) ∩ (F, A)^c) = 0.1708, d((F, A) ∩ (F, A)^c, U) = 0.3875, d((F, A) ∪ (F, A)^c, φ) = 0.3875 and d((F, A) ∩ (F, A)^c, φ) = 0.2167. Therefore, the values of the entropy measures defined in Theorems 1 to 12 are computed as:

(i) M1(F, A) = 1 − d((F, A), (F, A)^c) = 1 − 0.1708 = 0.8292.
(ii) M2(F, A) = d((F, A) ∪ (F, A)^c, U) = 0.2167.
(iii) M3(F, A) = 1 − d((F, A) ∪ (F, A)^c, (F, A) ∩ (F, A)^c) = 1 − 0.1708 = 0.8292.
(iv) M4(F, A) = d((F, A) ∪ (F, A)^c, U) / d((F, A) ∩ (F, A)^c, U) = 0.2167/0.3875 = 0.5592.
(v) M5(F, A) = d((F, A) ∩ (F, A)^c, φ) / d((F, A) ∪ (F, A)^c, φ) = 0.2167/0.3875 = 0.5592.
(vi) M6(F, A) = d((F, A) ∪ (F, A)^c, U) / d((F, A) ∪ (F, A)^c, φ) = 0.2167/0.3875 = 0.5592.
(vii) M7(F, A) = d((F, A) ∩ (F, A)^c, φ) / d((F, A) ∩ (F, A)^c, U) = 0.2167/0.3875 = 0.5592.
(viii) M8(F, A) = 1 − d((F, A) ∩ (F, A)^c, U) + d((F, A) ∪ (F, A)^c, U) = 1 − 0.3875 + 0.2167 = 0.8292.
(ix) M9(F, A) = 1 − d((F, A) ∪ (F, A)^c, φ) + d((F, A) ∩ (F, A)^c, φ) = 1 − 0.3875 + 0.2167 = 0.8292.
(x) M10(F, A) = 1 − d((F, A) ∪ (F, A)^c, φ) + d((F, A) ∪ (F, A)^c, U) = 1 − 0.3875 + 0.2167 = 0.8292.
(xi) M11(F, A) = 1 − d((F, A) ∩ (F, A)^c, U) + d((F, A) ∩ (F, A)^c, φ) = 1 − 0.3875 + 0.2167 = 0.8292.

Next, we give an illustrative example from the field of pattern recognition, which is stated and demonstrated below.

5.1. The Scenario

A type of robot has a single eye capable of capturing (and hence memorizing) things it sees as an 850 × 640, 24-bit bitmap image. The robot was shown an object (a pillow with a smiley), and the image that was captured by the robot's eye at that instant is shown in Figure 1. This image was saved as pic001.bmp in the memory of the robot.

Figure 1. The image of the object captured by the robot.

The robot was then given a way (in this example, it is done by human input) to recognize the object: whenever the robot encounters the object again, it retrieves the colors at certain coordinates of its field of vision and then compares them with the same coordinates from the image pic001.bmp stored in its memory. In order to distinguish noise, the coordinates are chosen in clusters of four, as shown in Figure 2.

Figure 2. The clusters of the image pic001.bmp for recognition purposes. Figure 2. The clusters of the image pic001.bmp for recognition purposes. Table 1. The coordinates of the clusters for image pic001.bmp from Figure 2.

“Left Eye” (LEn)

1st Position (n = 1) (323, 226), (324, 226), (323, 227), (324, 227), (486, 226), (487, 226),

2nd Position (n = 2) (301, 252), (302, 252), (301, 253), (302, 253), (464, 252), (465, 252),

3rd Position (n = 3) (345, 252), (346, 252), (345, 253), (346, 253), (509, 252), (510, 252),

Entropy 2018, 20, 403

11 of 19

Figure 2. The clusters of the image pic001.bmp for recognition purposes.

Table 1. The coordinates of the clusters for image pic001.bmp from Figure 2.

Cluster                       1st Position (n = 1)        2nd Position (n = 2)        3rd Position (n = 3)
"Left Eye" (LEn)              (323, 226), (324, 226),     (301, 252), (302, 252),     (345, 252), (346, 252),
                              (323, 227), (324, 227)      (301, 253), (302, 253)      (345, 253), (346, 253)
"Right Eye" (REn)             (486, 226), (487, 226),     (464, 252), (465, 252),     (509, 252), (510, 252),
                              (486, 227), (487, 227)      (464, 253), (465, 253)      (509, 253), (510, 253)
"Left side of Face" (LFn)     (284, 119), (285, 119),     (167, 312), (168, 312),     (275, 519), (276, 519),
                              (284, 120), (285, 120)      (167, 313), (168, 313)      (275, 520), (276, 520)
"Centre of Face" (CFn)        (407, 168), (408, 168),     (406, 262), (407, 262),     (406, 363), (407, 363),
                              (407, 169), (408, 169)      (406, 263), (407, 263)      (406, 364), (407, 364)
"Right side of Face" (RFn)    (553, 120), (554, 120),     (671, 307), (672, 307),     (562, 521), (563, 521),
                              (553, 121), (554, 121)      (671, 308), (672, 308)      (562, 522), (563, 522)
"Tongue" (Tn)                 (581, 404), (582, 404),     (562, 429), (563, 429),     (598, 430), (599, 430),
                              (581, 405), (582, 405)      (562, 430), (563, 430)      (598, 431), (599, 431)
"Mouth" (Mn)                  (274, 403), (278, 407),     (393, 469), (401, 469),     (553, 395), (556, 389),
                              (282, 411), (286, 415)      (409, 469), (417, 469)      (559, 383), (562, 377)

Remark: For a computer image, the top-leftmost pixel is labeled (0, 0).
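For experimentation, Table 1 can be carried in code as a plain mapping from cluster label and position n to the four pixel coordinates. The sketch below is illustrative only (the name CLUSTERS is ours, and only three of the seven clusters are written out):

```python
# Table 1 as a data structure: for each cluster label and position n (1, 2, 3),
# the four pixel coordinates (x, y); the top-leftmost pixel of the image is (0, 0).
CLUSTERS = {
    "LE": {1: [(323, 226), (324, 226), (323, 227), (324, 227)],
           2: [(301, 252), (302, 252), (301, 253), (302, 253)],
           3: [(345, 252), (346, 252), (345, 253), (346, 253)]},
    "RE": {1: [(486, 226), (487, 226), (486, 227), (487, 227)],
           2: [(464, 252), (465, 252), (464, 253), (465, 253)],
           3: [(509, 252), (510, 252), (509, 253), (510, 253)]},
    "M":  {1: [(274, 403), (278, 407), (282, 411), (286, 415)],
           2: [(393, 469), (401, 469), (409, 469), (417, 469)],
           3: [(553, 395), (556, 389), (559, 383), (562, 377)]},
    # LF, CF, RF and T follow the same pattern.
}

# Every cluster holds exactly four pixels (b = 0, 1, 2, 3).
assert all(len(pts) == 4 for by_n in CLUSTERS.values() for pts in by_n.values())
```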

We now have three images, namely image A, image B and image C. The robot needs to recognize if the objects shown in images A, B and C are the same as the image pic001.bmp stored in the robot's memory. Images A, B and C are shown in Figures 3–5, respectively. For comparison purposes, pic001.bmp is shown alongside all three images.


Figure 3. Image A (left) and the original image pic001.bmp (right).

Figure 4. Image B (left) and the original image pic001.bmp (right).


Entropy 2018, 20, 403



Figure 5. Image C (left) and the original image pic001.bmp (right).

From a human perspective, it is clear that the object shown in image A will be recognized as the same object shown in image pic001.bmp, and it will be concluded that the object shown in image B (a red airplane) is not the image pic001.bmp stored in the memory of the robot. No conclusion can be deduced from image C as it is made up of only noise, and therefore we are unable to deduce the exact object behind the noise. By retrieving the coordinates from Table 1, we now obtain the sets of colors which are given in Table 2.

Table 2. The sets of colors for image A, B, C and image pic001.bmp.

[Table 2 lists, for each cluster k in {LE, RE, LF, CF, RF, T, M} and each position n = 1, 2, 3, the colors sampled from image pic001.bmp (memory, m = 0) and from images A (m = 1), B (m = 2) and C (m = 3). The entries are color swatches in the original article and cannot be reproduced in text form.]

The luminosity and hue of the pixels are obtained using a picture editing program, and these are given in Tables 3 and 4, respectively. Luminosity: L_{m,k,n,b}, where k ∈ {LE, RE, LF, CF, RF, T, M} and b ∈ {0, 1, 2, 3} (cluster of four pixels). Hue: ℌ_{m,k,n,b}, where k ∈ {LE, RE, LF, CF, RF, T, M} and b ∈ {0, 1, 2, 3} (cluster of four pixels).
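The luminosity and hue values in Tables 3 and 4 lie in the 0–240 range used by common picture editing programs for the HSL color model. A small sketch of that conversion (an assumption on our part: the standard HSL formulas rescaled to 0–240; the function name paint_hsl is ours):

```python
import colorsys

def paint_hsl(r, g, b):
    """Convert an 8-bit RGB pixel to (hue, luminosity) on the 0-240 scale
    used by common picture editing programs (standard HSL, rescaled)."""
    h, l, s = colorsys.rgb_to_hls(r / 255.0, g / 255.0, b / 255.0)
    return round(h * 240), round(l * 240)

print(paint_hsl(255, 0, 0))  # pure red -> (0, 120)
```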

5.2. Formation of CVSS and Calculation of Entropies

Let U = {x1, x2, x3} and A = {LE, RE, LF, CF, RF, T, M}. We now form three CVSSs (F_m, A), m ∈ {1, 2, 3}, which denote images A, B and C, respectively, using the formula given below, for fixed parameters ρ, ρ′ > 0:

F_m(k)(x_n) = [e^(−ϑ²_{m,(k,n)}/ρ), e^(−γ²_{m,(k,n)}/ρ)] e^(2πi[e^(−y²_{m,(k,n)}/ρ′), e^(−θ²_{m,(k,n)}/ρ′)])

where:

γ_{m,(k,n)} = min{ |L_{m,k,n,p} − L_{0,k,n,q}| : p, q ∈ {0, 1, 2, 3} },
ϑ_{m,(k,n)} = max{ |L_{m,k,n,p} − L_{0,k,n,q}| : p, q ∈ {0, 1, 2, 3} },
θ_{m,(k,n)} = min{ |ℌ_{m,k,n,p} − ℌ_{0,k,n,q}| : p, q ∈ {0, 1, 2, 3} },
y_{m,(k,n)} = max{ |ℌ_{m,k,n,p} − ℌ_{0,k,n,q}| : p, q ∈ {0, 1, 2, 3} },

and L_{m,k,n,b} and ℌ_{m,k,n,b} are the luminosity and hue values given in Tables 3 and 4.
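The formula above can be checked numerically against Tables 3–5. A minimal sketch (the function name cvss_cell is ours; note that the tabulated values appear to be truncated rather than rounded, so the last digit may differ):

```python
import math

RHO = 6400.0        # rho, for the amplitude (luminosity) term
RHO_PRIME = 6400.0  # rho', for the phase (hue) term

def cvss_cell(lum_m, lum_0, hue_m, hue_0, rho=RHO, rho_p=RHO_PRIME):
    """Membership grade of one cluster (k, n) of image m against the stored
    image (m = 0): amplitude interval [e^(-theta_max^2/rho), e^(-gamma^2/rho)]
    and the interval inside the phase factor e^(2*pi*i*[.])."""
    gamma = min(abs(p - q) for p in lum_m for q in lum_0)      # min luminosity difference
    vartheta = max(abs(p - q) for p in lum_m for q in lum_0)   # max luminosity difference
    theta = min(abs(p - q) for p in hue_m for q in hue_0)      # min hue difference
    y = max(abs(p - q) for p in hue_m for q in hue_0)          # max hue difference
    amp = (math.exp(-vartheta**2 / rho), math.exp(-gamma**2 / rho))
    phase = (math.exp(-y**2 / rho_p), math.exp(-theta**2 / rho_p))
    return amp, phase

# Cluster RE, n = 1, image A (m = 1) against pic001.bmp (m = 0); values from Tables 3 and 4.
amp, phase = cvss_cell([43, 45, 44, 46], [24, 23, 23, 24], [19, 19, 19, 19], [13, 13, 13, 13])
print([round(v, 3) for v in amp], [round(v, 3) for v in phase])
# -> [0.921, 0.945] [0.994, 0.994], i.e. the RE entry [0.920, 0.945]e^2pi*i[0.994, 0.994] of Table 5
```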

Table 3. The values of the luminosity for image A, B, C and image pic001.bmp. Each cell lists the four pixel values of the cluster (b = 0, 1, 2, 3).

k    Image (m)               n = 1               n = 2               n = 3
LE   pic001.bmp (m = 0)      23, 24, 23, 24      23, 24, 24, 24      24, 24, 23, 25
     Image A (m = 1)         24, 25, 27, 28      34, 38, 33, 39      32, 31, 32, 31
     Image B (m = 2)         5, 6, 6, 6          97, 116, 96, 110    74, 81, 72, 80
     Image C (m = 3)         27, 25, 128, 82     57, 78, 54, 107     180, 120, 2, 26
RE   pic001.bmp (m = 0)      24, 23, 23, 24      21, 22, 24, 21      23, 24, 24, 22
     Image A (m = 1)         43, 45, 44, 46      41, 41, 42, 43      47, 48, 46, 48
     Image B (m = 2)         7, 6, 7, 6          8, 8, 7, 8          8, 7, 7, 7
     Image C (m = 3)         28, 120, 112, 64    48, 0, 0, 67        2, 0, 120, 97
LF   pic001.bmp (m = 0)      101, 96, 99, 100    104, 106, 105, 106  90, 91, 90, 88
     Image A (m = 1)         78, 78, 79, 78      55, 55, 57, 56      96, 95, 96, 95
     Image B (m = 2)         3, 3, 3, 3          162, 163, 163, 163  122, 125, 120, 122
     Image C (m = 3)         58, 65, 63, 67      24, 14, 31, 61      62, 40, 24, 96
CF   pic001.bmp (m = 0)      85, 88, 88, 89      80, 81, 83, 82      78, 78, 79, 78
     Image A (m = 1)         97, 97, 102, 98     103, 103, 104, 104  102, 99, 101, 100
     Image B (m = 2)         6, 5, 5, 5          79, 81, 77, 81      24, 33, 24, 32
     Image C (m = 3)         49, 60, 28, 81      0, 3, 22, 139       109, 27, 56, 96
RF   pic001.bmp (m = 0)      83, 84, 82, 84      64, 65, 64, 63      60, 59, 59, 59
     Image A (m = 1)         119, 120, 119, 119  136, 135, 136, 134  139, 138, 141, 139
     Image B (m = 2)         6, 6, 6, 6          16, 16, 15, 14      62, 61, 63, 64
     Image C (m = 3)         98, 64, 40, 0       93, 20, 94, 30      56, 45, 16, 44
T    pic001.bmp (m = 0)      60, 59, 60, 61      59, 61, 59, 62      58, 60, 59, 59
     Image A (m = 1)         142, 144, 143, 144  144, 145, 144, 144  152, 150, 151, 148
     Image B (m = 2)         116, 120, 117, 119  27, 25, 27, 24      127, 125, 127, 125
     Image C (m = 3)         74, 120, 28, 25     0, 104, 75, 3       81, 0, 198, 75
M    pic001.bmp (m = 0)      26, 25, 23, 26      15, 13, 14, 13      9, 8, 9, 8
     Image A (m = 1)         19, 19, 21, 20      34, 43, 40, 40      33, 36, 35, 36
     Image B (m = 2)         81, 86, 87, 93      175, 181, 191, 145  36, 79, 42, 66
     Image C (m = 3)         77, 113, 22, 84     112, 121, 15, 115   14, 31, 24, 33

Table 4. The values of the hue of the pixels for image A, B, C and image pic001.bmp. Each cell lists the four pixel values of the cluster (b = 0, 1, 2, 3).

k    Image (m)               n = 1               n = 2               n = 3
LE   pic001.bmp (m = 0)      12, 13, 12, 12      10, 9, 10, 10       12, 13, 12, 12
     Image A (m = 1)         17, 17, 17, 17      15, 15, 15, 15      18, 18, 16, 16
     Image B (m = 2)         187, 160, 160, 160  8, 8, 8, 8          6, 6, 7, 7
     Image C (m = 3)         55, 214, 80, 173    23, 127, 160, 70    226, 168, 27, 66
RE   pic001.bmp (m = 0)      13, 13, 13, 13      13, 13, 13, 12      12, 12, 11, 11
     Image A (m = 1)         19, 19, 19, 19      18, 18, 18, 18      20, 19, 20, 20
     Image B (m = 2)         160, 160, 160, 160  160, 160, 160, 160  160, 160, 160, 160
     Image C (m = 3)         112, 152, 64, 150   227, 160, 160, 177  220, 160, 119, 77
LF   pic001.bmp (m = 0)      31, 31, 31, 30      31, 31, 31, 31      31, 31, 30, 30
     Image A (m = 1)         28, 29, 28, 28      27, 27, 27, 27      31, 31, 31, 30
     Image B (m = 2)         187, 187, 187, 187  173, 171, 167, 167  7, 9, 7, 7
     Image C (m = 3)         12, 23, 167, 237    169, 18, 67, 76     57, 186, 211, 199
CF   pic001.bmp (m = 0)      29, 29, 29, 29      29, 29, 28, 29      29, 29, 29, 29
     Image A (m = 1)         30, 30, 29, 29      30, 30, 30, 30      31, 30, 30, 30
     Image B (m = 2)         160, 180, 160, 180  8, 8, 8, 8          224, 213, 230, 220
     Image C (m = 3)         155, 42, 86, 51     160, 200, 214, 107  177, 97, 0, 165
RF   pic001.bmp (m = 0)      29, 30, 30, 30      29, 29, 29, 29      31, 31, 31, 31
     Image A (m = 1)         30, 30, 30, 30      32, 32, 32, 32      33, 32, 33, 33
     Image B (m = 2)         160, 160, 160, 160  160, 160, 160, 153  139, 137, 139, 141
     Image C (m = 3)         154, 68, 5, 160     238, 192, 68, 76    94, 131, 36, 158
T    pic001.bmp (m = 0)      11, 11, 11, 11      11, 9, 12, 9        10, 9, 10, 10
     Image A (m = 1)         12, 12, 12, 12      12, 11, 12, 12      13, 13, 12, 12
     Image B (m = 2)         13, 9, 9, 13        168, 164, 165, 160  7, 8, 7, 10
     Image C (m = 3)         119, 29, 212, 205   160, 31, 45, 67     160, 160, 130, 200
M    pic001.bmp (m = 0)      11, 11, 10, 12      13, 13, 13, 13      18, 14, 14, 18
     Image A (m = 1)         16, 15, 15, 16      18, 21, 20, 19      17, 16, 16, 17
     Image B (m = 2)         10, 13, 14, 14      19, 17, 18, 15      208, 184, 183, 5
     Image C (m = 3)         145, 42, 118, 29    195, 49, 26, 181    124, 115, 160, 220

We choose ρ = 6400 and ρ′ = 6400 for this scenario. The CVSSs that were formed for this scenario are given in Tables 5–7.

Table 5. Tabular representation of (F1, A).

k    n = 1                              n = 2                              n = 3
LE   [0.996, 1.000]e^2πi[0.996, 0.997]  [0.960, 0.987]e^2πi[0.994, 0.996]  [0.987, 0.994]e^2πi[0.994, 0.998]
RE   [0.920, 0.945]e^2πi[0.994, 0.994]  [0.927, 0.955]e^2πi[0.994, 0.996]  [0.899, 0.927]e^2πi[0.987, 0.992]
LF   [0.920, 0.955]e^2πi[0.998, 0.999]  [0.666, 0.708]e^2πi[0.997, 0.997]  [0.990, 0.997]e^2πi[0.999, 1.000]
CF   [0.955, 0.990]e^2πi[0.999, 1.000]  [0.913, 0.939]e^2πi[0.999, 0.999]  [0.913, 0.939]e^2πi[0.999, 0.999]
RF   [0.798, 0.825]e^2πi[0.999, 1.000]  [0.434, 0.475]e^2πi[0.998, 0.998]  [0.349, 0.386]e^2πi[0.999, 0.999]
T    [0.323, 0.358]e^2πi[0.999, 0.999]  [0.314, 0.349]e^2πi[0.998, 1.000]  [0.251, 0.298]e^2πi[0.997, 0.999]
M    [0.992, 0.999]e^2πi[0.994, 0.998]  [0.884, 0.913]e^2πi[0.990, 0.996]  [0.868, 0.945]e^2πi[0.998, 0.999]

Table 6. Tabular representation of (F2, A).

k    n = 1                              n = 2                              n = 3
LE   [0.945, 0.955]e^2πi[0.008, 0.034]  [0.258, 0.444]e^2πi[0.999, 0.999]  [0.591, 0.708]e^2πi[0.992, 0.996]
RE   [0.950, 0.960]e^2πi[0.034, 0.034]  [0.955, 0.973]e^2πi[0.032, 0.034]  [0.955, 0.969]e^2πi[0.031, 0.032]
LF   [0.222, 0.258]e^2πi[0.021, 0.022]  [0.580, 0.612]e^2πi[0.042, 0.055]  [0.807, 0.876]e^2πi[0.913, 0.933]
CF   [0.332, 0.377]e^2πi[0.028, 0.068]  [0.994, 1.000]e^2πi[0.933, 0.939]  [0.623, 0.728]e^2πi[0.001, 0.005]
RF   [0.386, 0.405]e^2πi[0.068, 0.071]  [0.666, 0.708]e^2πi[0.068, 0.090]  [0.996, 0.999]e^2πi[0.150, 0.172]
T    [0.559, 0.623]e^2πi[0.999, 0.999]  [0.798, 0.852]e^2πi[0.019, 0.032]  [0.475, 0.516]e^2πi[0.998, 1.000]
M    [0.465, 0.623]e^2πi[0.997, 1.000]  [0.007, 0.071]e^2πi[0.994, 0.999]  [0.454, 0.892]e^2πi[0.002, 0.987]

Table 7. Tabular representation of (F3, A).

k    n = 1                              n = 2                              n = 3
LE   [0.178, 0.999]e^2πi[0.001, 0.759]  [0.332, 0.868]e^2πi[0.028, 0.973]  [0.021, 0.999]e^2πi[0.000, 0.969]
RE   [0.229, 0.997]e^2πi[0.048, 0.666]  [0.718, 0.933]e^2πi[0.000, 0.034]  [0.222, 0.939]e^2πi[0.001, 0.516]
LF   [0.749, 0.876]e^2πi[0.001, 0.992]  [0.266, 0.749]e^2πi[0.051, 0.973]  [0.495, 0.996]e^2πi[0.005, 0.899]
CF   [0.559, 0.997]e^2πi[0.083, 0.973]  [0.340, 0.612]e^2πi[0.004, 0.386]  [0.655, 0.955]e^2πi[0.032, 0.876]
RF   [0.332, 0.969]e^2πi[0.068, 0.913]  [0.728, 0.884]e^2πi[0.001, 0.788]  [0.738, 0.998]e^2πi[0.080, 0.996]
T    [0.559, 0.973]e^2πi[0.001, 0.950]  [0.548, 0.973]e^2πi[0.028, 0.945]  [0.046, 0.965]e^2πi[0.003, 0.105]
M    [0.282, 0.999]e^2πi[0.057, 0.955]  [0.161, 1.000]e^2πi[0.005, 0.973]  [0.906, 0.996]e^2πi[0.001, 0.229]


By using Definition 10, the entropy values for images A, B and C are as summarized in Table 8.

Table 8. Summary of the entropy values for image A, B and C.

Entropy Measure   Image A (F1, A)   Image B (F2, A)   Image C (F3, A)
M1(Fi, A)         0.571             0.647             0.847
M2(Fi, A)         0.039             0.089             0.328
M3(Fi, A)         0.571             0.647             0.847
M4(Fi, A)         0.084             0.202             0.682
M5(Fi, A)         0.084             0.202             0.682
M6(Fi, A)         0.084             0.202             0.682
M7(Fi, A)         0.084             0.202             0.682
M8(Fi, A)         0.571             0.647             0.847
M9(Fi, A)         0.571             0.647             0.847
M10(Fi, A)        0.571             0.647             0.847
M11(Fi, A)        0.571             0.647             0.847
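The consistent ordering of the three images across all eleven measures can be checked mechanically from the Table 8 values (a sketch; the variable name TABLE_8 is ours):

```python
# Entropy values from Table 8; each tuple holds the values for images A, B and C.
TABLE_8 = {
    "M1": (0.571, 0.647, 0.847), "M2": (0.039, 0.089, 0.328),
    "M3": (0.571, 0.647, 0.847), "M4": (0.084, 0.202, 0.682),
    "M5": (0.084, 0.202, 0.682), "M6": (0.084, 0.202, 0.682),
    "M7": (0.084, 0.202, 0.682), "M8": (0.571, 0.647, 0.847),
    "M9": (0.571, 0.647, 0.847), "M10": (0.571, 0.647, 0.847),
    "M11": (0.571, 0.647, 0.847),
}

# Every measure gives the same strict ordering M_i(F1, A) < M_i(F2, A) < M_i(F3, A):
# image A is closest to the stored image, image C is the most abnormal.
assert all(a < b < c for (a, b, c) in TABLE_8.values())
print("ordering holds for all", len(TABLE_8), "measures")
```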

From these values, it can be clearly seen that Mi(F3, A) > Mi(F2, A) > Mi(F1, A) for all i = 1, 2, ..., 11. Hence it can be concluded that Image A is the image that is closest to the original image pic001.bmp that is stored in the memory of the robot, whereas Image C is the image that is the least similar to the original pic001.bmp image. The high entropy value for (F3, A) is also an indication of the abnormality of Image C compared to Images A and B. These entropy values and the results obtained for this scenario demonstrate the effectiveness of our proposed entropy formulas. The entropy values obtained in Table 8 further verify the validity of the relationships between the 11 formulas that were proposed in Section 4.

6. Conclusions

The objective of this work is to introduce some entropy measures for the complex vague soft set environment to measure the degree of vagueness between sets. For this, we first give the axiomatic definitions of the distance and entropy measures for two CVSSs, and then propose some desirable relations between the distance and the entropy. The advantage of the proposed measures is that they are defined over sets where the membership and non-membership degrees are complex-valued rather than real-valued. All of the information measures proposed here complement the CVSS model in representing and modeling time-periodic phenomena. The proposed measures are illustrated with a numerical example related to the problem of image detection by a robot. Furthermore, the use of CVSSs enables efficient modeling of the periodicity and/or the non-physical attributes in signal processing, image detection, and multi-dimensional pattern recognition, all of which contain multi-dimensional data. The work presented in this paper can be used as a foundation to further extend the study of information measures for complex fuzzy sets or their generalizations. On our part, we are currently working on studying the inclusion measures and developing clustering algorithms for CVSSs. In the future, the results of this paper can be extended to other uncertain and fuzzy environments [59–68].

Author Contributions: Conceptualization, Methodology, Validation, Writing-Original Draft Preparation, G.S.; Writing-Review & Editing, H.G.; Investigation and Visualization, S.G.Q.; Funding Acquisition, G.S.

Funding: This research was funded by the Ministry of Higher Education, Malaysia, grant number FRGS/1/2017/STG06/UCSI/03/1 and UCSI University, Malaysia, grant number Proj-In-FOBIS-014.

Acknowledgments: The authors are thankful to the editor and anonymous reviewers for their constructive comments and suggestions that helped us in improving the paper significantly. The authors would like to gratefully acknowledge the financial assistance received from the Ministry of Education, Malaysia under grant no. FRGS/1/2017/STG06/UCSI/03/1 and UCSI University, Malaysia under grant no. Proj-In-FOBIS-014.

Conflicts of Interest: The authors declare no conflict of interest.
On our part, we are currently working on studying the inclusion measures and developing clustering algorithms for CVSSs. In the future, the result of this paper can be extended to some other uncertain and fuzzy environment [59–68]. Author Contributions: Conceptualization, Methodology, Validation, Writing-Original Draft Preparation, G.S.; Writing-Review & Editing, H.G.; Investigation and Visualization, S.G.Q.; Funding Acquisition, G.S. Funding: This research was funded by the Ministry of Higher Education, Malaysia, grant number FRGS/1/2017/ STG06/UCSI/03/1 and UCSI University, Malaysia, grant number Proj-In-FOBIS-014. Acknowledgments: The authors are thankful to the editor and anonymous reviewers for their constructive comments and suggestions that helped us in improving the paper significantly. The authors would like to gratefully acknowledge the financial assistance received from the Ministry of Education, Malaysia under grant no. FRGS/1/2017/STG06/UCSI/03/1 and UCSI University, Malaysia under grant no. Proj-In-FOBIS-014. Conflicts of Interest: The authors declare no conflict of interest.

Entropy 2018, 20, 403


References

1. Zadeh, L.A. Fuzzy sets. Inf. Control 1965, 8, 338–353.
2. Deluca, A.; Termini, S. A definition of non-probabilistic entropy in the setting of fuzzy set theory. Inf. Control 1971, 20, 301–312.
3. Liu, X. Entropy, distance measure and similarity measure of fuzzy sets and their relations. Fuzzy Sets Syst. 1992, 52, 305–318.
4. Fan, J.; Xie, W. Distance measures and induced fuzzy entropy. Fuzzy Sets Syst. 1999, 104, 305–314.
5. Atanassov, K.T. Intuitionistic fuzzy sets. Fuzzy Sets Syst. 1986, 20, 87–96.
6. Gau, W.L.; Buehrer, D.J. Vague sets. IEEE Trans. Syst. Man Cybern. 1993, 23, 610–613.
7. Atanassov, K.; Gargov, G. Interval-valued intuitionistic fuzzy sets. Fuzzy Sets Syst. 1989, 31, 343–349.
8. Szmidt, E.; Kacprzyk, J. Entropy for intuitionistic fuzzy sets. Fuzzy Sets Syst. 2001, 118, 467–477.
9. Vlachos, I.K.; Sergiadis, G.D. Intuitionistic fuzzy information—Application to pattern recognition. Pattern Recognit. Lett. 2007, 28, 197–206.
10. Burillo, P.; Bustince, H. Entropy on intuitionistic fuzzy sets and on interval-valued fuzzy sets. Fuzzy Sets Syst. 1996, 78, 305–316.
11. Garg, H.; Agarwal, N.; Tripathi, A. Generalized intuitionistic fuzzy entropy measure of order α and degree β and its applications to multi-criteria decision making problem. Int. J. Fuzzy Syst. Appl. 2017, 6, 86–107.
12. Liao, H.; Xu, Z.; Zeng, X. Distance and similarity measures for hesitant fuzzy linguistic term sets and their application in multi-criteria decision making. Inf. Sci. 2014, 271, 125–142.
13. Garg, H.; Agarwal, N.; Tripathi, A. A novel generalized parametric directed divergence measure of intuitionistic fuzzy sets with its application. Ann. Fuzzy Math. Inform. 2017, 13, 703–727.
14. Gou, X.; Xu, Z.; Liao, H. Hesitant fuzzy linguistic entropy and cross-entropy measures and alternative queuing method for multiple criteria decision making. Inf. Sci. 2017, 388–389, 225–246.
15. Liu, W.; Liao, H. A bibliometric analysis of fuzzy decision research during 1970–2015. Int. J. Fuzzy Syst. 2017, 19, 1–14.
16. Garg, H. Hesitant Pythagorean fuzzy sets and their aggregation operators in multiple attribute decision making. Int. J. Uncertain. Quantif. 2018, 8, 267–289.
17. Garg, H. Distance and similarity measure for intuitionistic multiplicative preference relation and its application. Int. J. Uncertain. Quantif. 2017, 7, 117–133.
18. Dimuro, G.P.; Bedregal, B.; Bustince, H.; Jurio, A.; Baczynski, M.; Mis, K. QL-operations and QL-implication functions constructed from tuples (O, G, N) and the generation of fuzzy subsethood and entropy measures. Int. J. Approx. Reason. 2017, 82, 170–192.
19. Liao, H.; Xu, Z.; Viedma, E.H.; Herrera, F. Hesitant fuzzy linguistic term set and its application in decision making: A state-of-the-art survey. Int. J. Fuzzy Syst. 2017, 1–27.
20. Garg, H.; Agarwal, N.; Tripathi, A. Choquet integral-based information aggregation operators under the interval-valued intuitionistic fuzzy set and its applications to decision-making process. Int. J. Uncertain. Quantif. 2017, 7, 249–269.
21. Garg, H. Linguistic Pythagorean fuzzy sets and its applications in multiattribute decision-making process. Int. J. Intell. Syst. 2018, 33, 1234–1263.
22. Garg, H. A robust ranking method for intuitionistic multiplicative sets under crisp, interval environments and its applications. IEEE Trans. Emerg. Top. Comput. Intell. 2017, 1, 366–374.
23. Yu, D.; Liao, H. Visualization and quantitative research on intuitionistic fuzzy studies. J. Intell. Fuzzy Syst. 2016, 30, 3653–3663.
24. Garg, H. Generalized intuitionistic fuzzy entropy-based approach for solving multi-attribute decision-making problems with unknown attribute weights. Proc. Natl. Acad. Sci. India Sect. A Phys. Sci. 2017, 1–11.
25. Garg, H. A new generalized improved score function of interval-valued intuitionistic fuzzy sets and applications in expert systems. Appl. Soft Comput. 2016, 38, 988–999.
26. Molodtsov, D. Soft set theory—First results. Comput. Math. Appl. 1999, 27, 19–31.
27. Maji, P.K.; Biswas, R.; Roy, A. Intuitionistic fuzzy soft sets. J. Fuzzy Math. 2001, 9, 677–692.

28. Maji, P.K.; Biswas, R.; Roy, A.R. Fuzzy soft sets. J. Fuzzy Math. 2001, 9, 589–602.
29. Yang, H.L. Notes on generalized fuzzy soft sets. J. Math. Res. Expos. 2011, 31, 567–570.
30. Majumdar, P.; Samanta, S.K. Generalized fuzzy soft sets. Comput. Math. Appl. 2010, 59, 1425–1432.
31. Agarwal, M.; Biswas, K.K.; Hanmandlu, M. Generalized intuitionistic fuzzy soft sets with applications in decision-making. Appl. Soft Comput. 2013, 13, 3552–3566.
32. Garg, H.; Arora, R. Generalized and group-based generalized intuitionistic fuzzy soft sets with applications in decision-making. Appl. Intell. 2018, 48, 343–356.
33. Majumdar, P.; Samanta, S. Similarity measure of soft sets. New Math. Nat. Comput. 2008, 4, 1–12.
34. Garg, H.; Arora, R. Distance and similarity measures for dual hesitant fuzzy soft sets and their applications in multi criteria decision-making problem. Int. J. Uncertain. Quantif. 2017, 7, 229–248.
35. Kharal, A. Distance and similarity measures for soft sets. New Math. Nat. Comput. 2010, 6, 321–334.
36. Jiang, Y.; Tang, Y.; Liu, H.; Chen, Z. Entropy on intuitionistic fuzzy soft sets and on interval-valued fuzzy soft sets. Inf. Sci. 2013, 240, 95–114.
37. Garg, H.; Agarwal, N.; Tripathi, A. Fuzzy number intuitionistic fuzzy soft sets and its properties. J. Fuzzy Set Valued Anal. 2016, 2016, 196–213.
38. Arora, R.; Garg, H. Prioritized averaging/geometric aggregation operators under the intuitionistic fuzzy soft set environment. Sci. Iran. E 2018, 25, 466–482.
39. Arora, R.; Garg, H. Robust aggregation operators for multi-criteria decision making with intuitionistic fuzzy soft set environment. Sci. Iran. E 2018, 25, 931–942.
40. Garg, H.; Arora, R. A nonlinear-programming methodology for multi-attribute decision-making problem with interval-valued intuitionistic fuzzy soft sets information. Appl. Intell. 2017, 1–16.
41. Garg, H.; Arora, R. Bonferroni mean aggregation operators under intuitionistic fuzzy soft set environment and their applications to decision-making. J. Oper. Res. Soc. 2018, 1–14.
42. Xu, W.; Ma, J.; Wang, S.; Hao, G. Vague soft sets and their properties. Comput. Math. Appl. 2010, 59, 787–794.
43. Chen, S.M. Measures of similarity between vague sets. Fuzzy Sets Syst. 1995, 74, 217–223.
44. Wang, C.; Qu, A. Entropy, similarity measure and distance measure of vague soft sets and their relations. Inf. Sci. 2013, 244, 92–106.
45. Selvachandran, G.; Maji, P.; Faisal, R.Q.; Salleh, A.R. Distance and distance induced intuitionistic entropy of generalized intuitionistic fuzzy soft sets. Appl. Intell. 2017, 1–16.
46. Ramot, D.; Milo, R.; Friedman, M.; Kandel, A. Complex fuzzy sets. IEEE Trans. Fuzzy Syst. 2002, 10, 171–186.
47. Ramot, D.; Friedman, M.; Langholz, G.; Kandel, A. Complex fuzzy logic. IEEE Trans. Fuzzy Syst. 2003, 11, 450–461.
48. Greenfield, S.; Chiclana, F.; Dick, S. Interval-valued complex fuzzy logic. In Proceedings of the IEEE International Conference on Fuzzy Systems (FUZZ), Vancouver, BC, Canada, 24–29 July 2016; pp. 1–6.
49. Yazdanbakhsh, O.; Dick, S. A systematic review of complex fuzzy sets and logic. Fuzzy Sets Syst. 2018, 338, 1–22.
50. Alkouri, A.; Salleh, A. Complex intuitionistic fuzzy sets. In Proceedings of the 2nd International Conference on Fundamental and Applied Sciences, Kuala Lumpur, Malaysia, 12–14 June 2012; Volume 1482, pp. 464–470.
51. Alkouri, A.U.M.; Salleh, A.R. Complex Atanassov's intuitionistic fuzzy relation. Abstr. Appl. Anal. 2013, 2013, 287382.
52. Rani, D.; Garg, H. Distance measures between the complex intuitionistic fuzzy sets and its applications to the decision-making process. Int. J. Uncertain. Quantif. 2017, 7, 423–439.
53. Kumar, T.; Bajaj, R.K. On complex intuitionistic fuzzy soft sets with distance measures and entropies. J. Math. 2014, 2014, 972198.
54. Selvachandran, G.; Maji, P.K.; Abed, I.E.; Salleh, A.R. Complex vague soft sets and its distance measures. J. Intell. Fuzzy Syst. 2016, 31, 55–68.
55. Selvachandran, G.; Maji, P.K.; Abed, I.E.; Salleh, A.R. Relations between complex vague soft sets. Appl. Soft Comput. 2016, 47, 438–448.

56. Selvachandran, G.; Garg, H.; Alaroud, M.H.S.; Salleh, A.R. Similarity measure of complex vague soft sets and its application to pattern recognition. Int. J. Fuzzy Syst. 2018, 1–14.
57. Singh, P.K. Complex vague set based concept lattice. Chaos Solitons Fractals 2017, 96, 145–153.
58. Hu, D.; Hong, Z.; Wang, Y. A new approach to entropy and similarity measure of vague soft sets. Sci. World J. 2014, 2014, 610125.
59. Arora, R.; Garg, H. A robust correlation coefficient measure of dual hesitant fuzzy soft sets and their application in decision making. Eng. Appl. Artif. Intell. 2018, 72, 80–92.
60. Qiu, D.; Lu, C.; Zhang, W.; Lan, Y. Algebraic properties and topological properties of the quotient space of fuzzy numbers based on Mareš equivalence relation. Fuzzy Sets Syst. 2014, 245, 63–82.
61. Garg, H. Generalised Pythagorean fuzzy geometric interactive aggregation operators using Einstein operations and their application to decision making. J. Exp. Theor. Artif. Intell. 2018.
62. Garg, H.; Arora, R. Novel scaled prioritized intuitionistic fuzzy soft interaction averaging aggregation operators and their application to multi criteria decision making. Eng. Appl. Artif. Intell. 2018, 71, 100–112.
63. Qiu, D.; Zhang, W. Symmetric fuzzy numbers and additive equivalence of fuzzy numbers. Soft Comput. 2013, 17, 1471–1477.
64. Garg, H.; Kumar, K. Distance measures for connection number sets based on set pair analysis and its applications to decision making process. Appl. Intell. 2018, 1–14.
65. Qiu, D.; Zhang, W.; Lu, C. On fuzzy differential equations in the quotient space of fuzzy numbers. Fuzzy Sets Syst. 2016, 295, 72–98.
66. Garg, H.; Kumar, K. An advanced study on the similarity measures of intuitionistic fuzzy sets based on the set pair analysis theory and their application in decision making. Soft Comput. 2018.
67. Garg, H.; Nancy. Linguistic single-valued neutrosophic prioritized aggregation operators and their applications to multiple-attribute group decision-making. J. Ambient Intell. Hum. Comput. 2018, 1–23.
68. Garg, H.; Kumar, K. Some aggregation operators for linguistic intuitionistic fuzzy set and its application to group decision-making process using the set pair analysis. Arab. J. Sci. Eng. 2018, 43, 3213–3227.

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).