Comparative Benchmarking of Methods for Approximate Implicitization

Elmar Wurm, Jan B. Thomassen, Bert Jüttler, Tor Dokken

Abstract. Recently, a variety of algorithms for the approximation of parametric curves and surfaces by algebraic representations have been developed. We test three of these methods on several test cases, keeping track of the time and memory consumption of the implementations and of the quality of the approximation. Additionally, we discuss some qualitative aspects of the different methods.

§1. Introduction

Mathematical descriptions of curves and surfaces are the basis of computer aided design, engineering and manufacturing. Traditionally, these processes rely mainly on parametric descriptions of the geometric objects involved. The benefits of parametric patches include fast visualization, simple descriptions of milling paths, and design via control points. Until recently, implicit representations did not receive much attention in applications. However, they have some advantages, such as easy position tests for points, the possibility to define solids, and fast intersection algorithms. Combining both representations, so as to exploit the advantages of each, seems to be a good approach. Consequently, methods for the conversion to and from implicit representations are needed.

Several exact methods for finding the implicit representation of parametric curves and surfaces have been investigated [2, 3, 4, 6, 8, 9]. Among these approaches are resultants, Groebner bases, and moving curves and surfaces. These exact methods are computationally expensive. Moreover, they may result in algebraic representations of high degree. The idea of approximate implicitization (AI) may help: by computing a low degree algebraic approximation for a given region of interest on a geometric object, the problems caused by exact implicitization can be avoided.
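The position test mentioned above illustrates why implicit forms are attractive: the sign of the implicit function f classifies a point relative to the surface f = 0. A minimal sketch (the sphere and all names are our own illustrative choices, not taken from the benchmarks):

```python
# Position test against an implicit surface f(x, y, z) = 0:
# f < 0 inside, f > 0 outside, f = 0 on the surface.
def f(x, y, z):
    # Unit sphere: an implicit surface of algebraic degree 2.
    return x * x + y * y + z * z - 1.0

def classify(p, eps=1e-9):
    v = f(*p)
    if abs(v) <= eps:
        return "on"
    return "inside" if v < 0.0 else "outside"

print(classify((0.0, 0.0, 0.0)))  # inside
print(classify((2.0, 0.0, 0.0)))  # outside
print(classify((1.0, 0.0, 0.0)))  # on
```

For a parametric patch, the same test requires solving a nonlinear system; this asymmetry is exactly what makes the combination of both representations attractive.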

This paper is devoted to benchmark tests of three recently developed methods for approximate implicitization. The first two have been investigated and implemented in C by the authors of this paper; for details on these methods see [7, 10] and [5], respectively. Both methods have been developed in the EU funded project GAIA II. The third method, which serves as a reference method, is the 'implicitize' procedure from the 'algcurves' package of Maple [1]. See the next section for a short survey of the different algorithms. Other methods for generating implicit representations of curves and surfaces have been developed in Computer Graphics. Here we focus our attention on algorithms generating implicit representations by polynomials, since these methods may help to improve the robustness of intersection algorithms in CAD.

Our intent is not to find a 'best method', but to study the properties, abilities and weaknesses of the different methods. We give a qualitative comparison, which also states the remaining problems of the different approaches. In order to obtain our results, we have run the various algorithms on a set of test cases. Their properties, as well as the motivation for the choice of these examples, are discussed in section 3. "Hard" criteria for the benchmarks are the computation time, the total memory used, and the error of the resulting implicit functions. More details on the methods used for these measurements are given in section 4. The results for these "hard" criteria are presented in section 5. Furthermore, we analyze other properties, like stability and scalability, in section 6.

§2. Short Descriptions of the Methods

AI by Scattered Data Fitting (Linz). The implicitization method based on scattered data fitting has been developed at the Johannes Kepler University Linz. In a preprocessing step we generate a point cloud from the given parametric representation.
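This preprocessing step can be sketched as follows; here the normals are taken directly from the cross product of (finite-difference) partial derivatives, whereas the actual implementation estimates them from the point cloud, and the bilinear test patch is our own choice:

```python
import numpy as np

def sample_point_cloud(s, n=20, h=1e-5):
    """Sample points and unit normals from a parametric surface s(u, v)
    over [0, 1]^2; normals come from the cross product s_u x s_v of
    finite-difference partial derivatives."""
    pts, nrm = [], []
    for u in np.linspace(h, 1.0 - h, n):
        for v in np.linspace(h, 1.0 - h, n):
            p = s(u, v)
            su = (s(u + h, v) - s(u - h, v)) / (2.0 * h)
            sv = (s(u, v + h) - s(u, v - h)) / (2.0 * h)
            nv = np.cross(su, sv)
            norm = np.linalg.norm(nv)
            if norm > 1e-9:  # skip (near-)singular parameter values
                pts.append(p)
                nrm.append(nv / norm)
    return np.array(pts), np.array(nrm)

# Example: a simple bilinear patch (our illustrative surface).
patch = lambda u, v: np.array([u, v, u * v])
P, N = sample_point_cloud(patch)
```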
The approximate implicitization is then found by fitting a curve or surface simultaneously to the sampled points and to auxiliary estimated normals. This leads to a system of linear equations for the unknown coefficients. For details see [10]. The input for this method consists of the degree of the implicit approximation and, additionally, the number of points sampled in the preprocessing step.

AI by Singular Value Decomposition (SINTEF). This method has been developed at SINTEF. A description can be found in [5]. In brief, we substitute the parametric curve or surface into an implicit function of chosen degree and unknown coefficients. This results in a factorization which uses a certain matrix. The coefficients of the implicit representation are found as the singular vector corresponding to the smallest singular value of this matrix. The only input parameter is the chosen degree of the implicit function.

Maple's "implicitize" command. This is based on the algorithm described in [1]. Again, the parametric curve or surface is substituted into an implicit function of chosen degree and unknown coefficients. The implicit approximation is found by minimizing the integral of the resulting parametric function over the given parameter domain with respect to the unknown coefficients. Again, the eigenvector associated with the smallest eigenvalue of a certain matrix has to be computed. In order to generate this matrix, several integrations have to be performed. Since, by choice, this task can be addressed in a symbolic or a numerical way, the benchmarks consider both options. The Linz method is partly related to this approach, as its use of a point sampling can be interpreted as a discretization of this method; 'partly', because the Linz algorithm additionally uses estimated normals in order to reduce the computational complexity even further.

§3. Test Cases

In this paper, we present only a representative selection of our larger set of test cases. All results derived from this selection have been confirmed by the test cases not quoted here. We focus on surfaces, as they usually pose more problems. Currently, all three prototypes are limited to producing single patch approximations; consequently, the test cases are Bézier surfaces. Implicit spline approximations, which are of interest for real world applications, will be the subject of future research. Some of the examples are chosen to be of low degree, in order to display the ability to reproduce essential features like singularities and self-intersections in a relatively simple setting. Other examples of higher degree were used to demonstrate the algorithmic dependencies.
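For concreteness, the core of the substitute-and-minimize construction used by the SINTEF and Maple approaches can be sketched in a few lines. The sketch below discretizes the substituted parametrization by point sampling (the method of [5] instead assembles the matrix exactly from products of basis functions), restricts itself to a planar curve, and uses a test parabola of our own choosing:

```python
import numpy as np

def implicitize_svd(x, y, deg, m=200):
    """Approximate implicitization of a planar parametric curve:
    find coefficients c of an implicit polynomial f of total degree
    `deg` minimizing ||f(x(t), y(t))|| over sampled parameter values."""
    t = np.linspace(0.0, 1.0, m)
    xs, ys = x(t), y(t)
    exps = [(i, j) for i in range(deg + 1)
                   for j in range(deg + 1 - i)]
    # Row k holds the values of all monomials x(t_k)^i * y(t_k)^j.
    D = np.column_stack([xs**i * ys**j for i, j in exps])
    # Coefficients = right singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(D)
    return exps, vt[-1]

# Parabola (t, t^2): exact implicit equation y - x^2 = 0 (degree 2).
exps, c = implicitize_svd(lambda t: t, lambda t: t**2, deg=2)
```

Since the chosen degree here equals the exact algebraic degree, the smallest singular value is (numerically) zero and the recovered coefficients are those of the exact implicit equation, up to scaling.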
Most of the test cases have singular points, as we expected them to be a challenge, especially for the Linz algorithm. Besides, it is possible that curves with singularities will be used in the design process by future generations of CAD systems, provided they can be better controlled. Descriptions of the test surfaces are given in table 1. Test cases marked by † are industrial benchmarks; they appear by courtesy of the CAD company think3. Figure 1 shows snapshots of the test surfaces.

§4. Benchmarking Criteria

As mentioned in the introduction, we test the different algorithms on the criteria of time consumption, total memory consumption and accuracy.


Surface       | Degree | Description
Quartic       | (3, 2) | Surface with self-intersection curves and a cusp-like singular point. The bi-degree of the Bézier representation suggests an exact implicit representation of degree 2 × 3 × 2 = 12, but due to reflection symmetries the correct degree is 4.
Simplesweep†  | (4, 3) | Self-intersecting. The exact implicit representation has degree 6. The surface is obtained by "sweeping" between a piece of a node and a piece of a parabola.
Self-ucurves† | (4, 2) | Surface with a closed self-intersection curve. Two points on this curve are cusp-like singularities. Effectively, the degree of the exact implicit representation for this surface is 8. The surface is obtained by blending a curve segment with a node between two non-self-intersecting curves.
Surface 6x5   | (6, 5) | A self-intersecting curve is moved in space and at the same time bent. The theoretical algebraic degree of this surface is 30.

Tab. 1. Description of test surfaces

Fig. 1. Test surfaces: (a) Quartic, (b) Simplesweep, (c) Self-u-curves, (d) Surface 6x5

For simplicity we will abbreviate the methods as 'Linz algorithm', 'SINTEF algorithm' and 'Maple algorithm', respectively. The quoted time is an approximation of the processor time used by the implementations, and similarly for the memory. These quantities were measured with standard Linux tools (time and top) or with calls to C system functions; in the case of the Maple algorithm, Maple's 'profile' procedure was used. The tests of the Linz and the Maple algorithms were executed on an Intel Pentium IV 1.7 GHz processor, whereas the SINTEF algorithm was tested on an Intel Pentium III 860 MHz processor. Both systems have the same operating system and working memory. The resulting times for the Linz and the Maple methods have been multiplied by the scaling factor 1.7, according to CPU benchmark tests (see [11]).

In order to quantify the accuracy of the Linz and SINTEF methods, we used the following procedure. As the implicit surface is computed in order to approximate the given parametric surface, it seems natural to measure the average and maximum Euclidean distance between the two surfaces. Since a compact representation of these values does not exist, we instead consider a set of points on the parametric surface and compute their Euclidean distances to the implicit surface. This is done by applying a Newton-Raphson iteration to each point. As we expect the implicit approximations to be close to the given parametric surfaces, the iteration process should be safe. In order to get a scale independent measurement, we divide the average and maximum distances by the length L of the shortest side of the bounding box of the parametric curve or surface. In a nutshell, the two error values listed are the scaled maximum and average Euclidean distances of a point sampling taken from the parametric surface with respect to the implicit surface. Values less than 10^-6 are quoted as zero.

§5. Benchmark Results

The reason for computing an approximate instead of an exact implicitization is to obtain an algebraic representation of low degree. As the algebraic degrees of some of the test cases are quite low, approximations of lower degree are usually not satisfactory; the reason is that the topological type of the curve or surface may change drastically. Therefore some benchmarks were performed using the exact algebraic degree. In this sense they are 'exact' implicitizations. In the case of the SINTEF algorithm, the use of the exact degree implies that the computed solution is the real implicit equation.
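The distance measurement described in §4 can be sketched as follows, under the simplifying assumption that a few gradient-based Newton steps suffice to project each sample onto the implicit surface (function names and the spherical test data are ours):

```python
import numpy as np

def newton_project(f, grad, p, iters=20):
    """Project p onto the zero set of f by Newton steps along grad f."""
    q = np.asarray(p, dtype=float)
    for _ in range(iters):
        g = grad(q)
        q = q - f(q) * g / np.dot(g, g)
    return q

def scaled_errors(f, grad, samples):
    """Maximum and average distance to {f = 0}, scaled by the shortest
    side L of the samples' bounding box (as in Section 4)."""
    d = np.array([np.linalg.norm(p - newton_project(f, grad, p))
                  for p in samples])
    L = np.min(np.ptp(samples, axis=0))
    return d.max() / L, d.mean() / L

# Toy check: points slightly off the unit sphere x^2 + y^2 + z^2 - 1 = 0.
f = lambda p: p @ p - 1.0
grad = lambda p: 2.0 * p
rng = np.random.default_rng(0)
pts = rng.normal(size=(100, 3))
pts = pts / np.linalg.norm(pts, axis=1, keepdims=True) * 1.001
emax, eavg = scaled_errors(f, grad, pts)
```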
Since the Linz method computes an approximation to a point cloud, it generally remains inexact, even when the exact degree is used for the implicitization process. By applying the process of normal vector adjustment, the solution can be made to converge to the exact one, but the convergence is rather slow. The results given in tables 2 and 3 were computed using the exact algebraic degree; the resulting surfaces are shown in figures 3 and 4. In order to show that our methods produce suitable low degree approximations for surfaces of high algebraic degree, we have selected Surface 6x5. The results for this benchmark are given in table 4, and figures 3 and 4 show the degree 8 approximations. From these benchmarks we see that

• all three algorithms are able to compute the exact implicitization, or an approximation, for all the test cases,
• all three prototypes can be run on a standard personal computer (memory and time cost),
• in the case of low degree approximations the SINTEF method is by far the fastest among the algorithms, and
• the time and memory consumption of the Maple method increases drastically when the degree grows (see table 6).

The 'symbolic' option was tested only for low degree approximations, as the computation for higher degrees can take several hours.

             |        |    Linz     |             |                          |   SINTEF    |
Surface      | Degree | Time (sec.) | Memory (Mb) | Error (10^-3) Av. / Max. | Time (sec.) | Memory (Mb)
Quartic      | 4      | 5.4         | 4.5         | 1.48 / 8.66              | 0.01        | 0.9
Simplesweep  | 6      | 13.0        | 8.5         | 0.02 / 2.81              | 0.25        | 1.2
Self-ucurves | 8      | 76.1        | 15.2        | 0.21 / 10.32             | 1.37        | 1.8

Tab. 2. Results for the Linz and SINTEF methods

Surface      | Degree | symbolic sec. / Mb | numerical sec. / Mb
Quartic      | 4      | 328 / 616          | 28 / 14.5
Simplesweep  | 6      | n. A.              | 82 / 44
Self-ucurves | 8      | n. A.              | 596 / 109

Tab. 3. Results for the Maple method

       |    Linz     |             |                          |   SINTEF    |             |
Degree | Time (sec.) | Memory (Mb) | Error (10^-3) Av. / Max. | Time (sec.) | Memory (Mb) | Error (10^-3) Av. / Max.
5      | 9.5         | 6.3         | 15.50 / 101.99           | 0.2         | 1.3         | 16.13 / 313.44
6      | 14.7        | 8.6         | 4.15 / 215.97            | 1.0         | 1.8         | 6.72 / 143.66
7      | 25.0        | 11.7        | 1.34 / 23.35             | 3.5         | 3.5         | 3.16 / 84.45
8      | 52.8        | 16.0        | 0.89 / 28.90             | 6.9         | 4.0         | 0.69 / 21.60
9      | 91.6        | 21.3        | 0.49 / 13.18             | 17          | 6.3         | 0.60 / 24.95
10     | 158.7       | 28.0        | 0.31 / 11.42             | 45          | 8.7         | 0.12 / 19.87

Tab. 4. Results for Surface 6x5 with the Linz and SINTEF algorithms

§6. Results for Other Criteria

In this section we consider the behavior of the algorithms with respect to a set of qualitative criteria, in contrast to the "hard" criteria of time, memory, and error. More precisely, we will discuss stability and scalability. In addition, we provide certain qualitative comparisons of the Linz and SINTEF algorithms.

Here, stability refers to the question of whether or not the algorithms work for all combinations of input parameters. More precisely, does an algorithm need a special representation of the input data, or some additional input parameters (e.g. degree of approximation, number of sampled points)? By scalability we mean the following: how does the output of the algorithms vary as the input is varied? There are several ways in which we have varied the input. First, we have considered different degrees of the approximation, based on the same input surface. Second, for the Linz algorithm, we have tested the effects of varying the number of sampled points. Third, for the SINTEF algorithm, we have compared the results of implicitizing different representations of the same surface, where the representations vary in their degrees.

Stability. Concerning the Linz method, the preprocessing steps (normal estimation, orientation propagation and segmentation) require the choice of several parameters. Currently, we are developing a number of heuristics for adjusting these parameters automatically, but a more rigorous analysis is not available yet. All the tests were run using the same parameters. A restrictive choice concerning the reliability of the normal estimates was the key to obtaining suitable normal fields. For all the test cases, no problems occurred. The stability of the linear system is mainly affected by the chosen degree. If it is higher than the exact algebraic degree, the linear system becomes singular. In such cases a tension term can be used to regularize the linear system, leading to a unique solution (see [10]).

The SINTEF algorithm works in principle for any choice of parameters. In particular cases (for very high degrees) the linear algebra routines (taken from the "newmat" C++ library) produced warning messages. In such a case the reliability of the result needs to be assured.
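The role of such a regularizing term can be sketched with a plain Tikhonov (ridge) term added to the normal equations; the actual tension functional of [10] is more elaborate, so the version below is our simplification:

```python
import numpy as np

def solve_regularized(A, b, tension=0.0):
    """Solve the least-squares system A c ~ b via the normal equations
    and a Cholesky factorization; a small `tension` weight makes the
    matrix positive definite even when A is rank deficient."""
    n = A.shape[1]
    M = A.T @ A + tension * np.eye(n)
    rhs = A.T @ b
    L = np.linalg.cholesky(M)  # raises if M is not positive definite
    y = np.linalg.solve(L, rhs)
    return np.linalg.solve(L.T, y)

# Rank-deficient example: a duplicated column makes A^T A singular,
# so the unregularized normal equations have no unique solution.
A = np.array([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]])
b = np.array([1.0, 2.0, 3.0])
c = solve_regularized(A, b, tension=1e-8)
```

With the tension term the system picks the (unique) small-norm solution, here with both coefficients close to 0.5.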
The Maple algorithm had no problems with any of the test cases when using the 'numerical' option. Due to the long computation times, the 'symbolic' option has only been benchmarked for some of the test cases.

Scalability. Benchmark series which demonstrate the influence of the degree of the approximation are given in tables 4 and 5. Measurements for 'self-ucurves' using the Maple method are given in table 6. A graphical presentation of the computation times is given in figure 2. If the input surface and the implicit approximation are of higher degree, the Linz prototype is able to outperform the SINTEF prototype. This is due to the fact that the SINTEF algorithm relies on SVD, whereas the Linz method uses only a Cholesky decomposition. For small degrees the SINTEF prototype is faster, as the involved matrix can be generated with much less effort. The error columns, finally, show the increasing accuracy of the approximation with rising degree.

       |    Linz     |             |                          |   SINTEF    |             |
Degree | Time (sec.) | Memory (Mb) | Error (10^-3) Av. / Max. | Time (sec.) | Memory (Mb) | Error (10^-3) Av. / Max.
2      | 4.2         | 2.5         | 57.50 / 206.26           | 0.00        | 0.9         | 116.0 / 482.8
3      | 4.5         | 3.3         | 31.69 / 144.97           | 0.00        | 1.0         | 76.5 / 343.1
4      | 5.5         | 4.5         | 19.17 / 171.65           | 0.01        | 1.0         | 27.9 / 114.7
5      | 8.2         | 6.2         | 6.12 / 342.82            | 0.04        | 1.2         | 22.3 / 114.6
6      | 13.6        | 8.5         | 1.40 / 31.62             | 0.14        | 1.3         | 23.2 / 44.0
7      | 28.0        | 11.7        | 0.13 / 11.22             | 0.53        | 1.5         | 8.4 / 38.0
8      | 51.2        | 15.9        | 0.21 / 10.32             | 1.38        | 1.8         | ~0 / ~0

Tab. 5. Different degrees for the Linz and SINTEF methods ('self-ucurves')

Degree | symbolic sec. / Mb | numerical sec. / Mb
2      | 1.1 / 8.6          | 3.6 / 2.5
3      | 1200 / 176.7       | 14.3 / 7.3
4      | n. A.              | 39.0 / 14.4
5      | n. A.              | 94.4 / 27.2
6      | n. A.              | 189.7 / 46.5
7      | n. A.              | 338.0 / 74.8
8      | n. A.              | 596.3 / 109

Tab. 6. Different degrees for the Maple method applied to 'self-ucurves, no cut'

Table 7 shows a test series for the Linz method, varying the number of points to be approximated. The last column reflects the error based on 40,000 points. These values show that the quality of the approximation is not necessarily improved by using more points. The values in the time column clearly point out the linear relation between the number of sampled points and the computation time. The Linz method relies on the correctness of the field of estimated normals; this determines a lower bound for the number of considered points. Thus, there is a trade-off between the stability and the speed of the algorithm. In order to generate sparse yet accurate samples, the parameters used in the preprocessing steps should be chosen interactively by the user.

Finally, we have analyzed the influence of the degree of the input surface. For the Linz method this does not make any difference, since it is based on sampled points: the input is immediately translated into a representation in terms of a point cloud, and the computation time depends solely on the number of sampled points and the number of coefficients of the approximation. This is confirmed by the similar time measurements in tables 4 and 5. For the SINTEF algorithm, however, there is a dependency.

Fig. 2. Summaries of the computation times: (a) Self-u-curves, (b) Surface 6x5

Number of points | Time (sec.) | Memory (Mb) | Error (10^-3) Av. / Max.
625              | 3.0         | 8.4         | 3.37 / 25.70
1,250            | 4.5         | 9.3         | 3.44 / 17.65
2,500            | 7.3         | 10.9        | 3.53 / 19.08
5,000            | 16.2        | 14.1        | 5.83 / 49.09
10,000           | 24.7        | 20.1        | 5.93 / 47.87
20,000           | 62.7        | 33.9        | 3.56 / 20.02
40,000           | 143.3       | 52.9        | 3.56 / 19.6

Tab. 7. Different numbers of sampled points for the quartic surface (Linz method)

The corresponding benchmarks differ. A more careful analysis reveals that the computing time depends linearly on the degrees of the input surface.

Qualitative Comparisons. In this section we discuss a number of qualitative comparisons of the Linz and SINTEF methods, and partially of the Maple method.

• Is the algorithm able to reproduce surfaces exactly? For the SINTEF algorithm the answer is 'yes'. Not so for the Linz algorithm, which always produces approximations, due to its point sampling based approach. As a result, singularities may not be reproduced exactly. This problem can partially be resolved by using 'normal vector adjustment' (however, the convergence is relatively slow). The Maple function reproduces exact results in the case of symbolic integration; using this option, the computation gets quite slow.

• Is the algorithm able to "push away" singularities or additional ('phantom') branches of the surfaces, if this is desired? The answer is 'yes' for

both the Linz and SINTEF algorithms, and 'no' for the Maple function. For the SINTEF algorithm, instead of choosing the implicit function corresponding to the smallest singular value of the involved matrix (see [5]), one may choose an n-dimensional space of functions corresponding to the n smallest singular values. From this set one may construct an implicit function with the desired properties. However, this requires a separate analysis of the different solutions associated with the singular values; currently, this has not been implemented. In the case of the Linz algorithm, the simultaneous approximation of points and estimated unit normals automatically pushes the unwanted branches away. Note that the second branch of the Simplesweep surface (figure 3) does not intersect the surface in the region of interest. The Maple function is not able to avoid these unwanted branches, since it only provides one output curve or surface.

• Is the algorithm fully automatic? All algorithms need the chosen degree of the implicit representation as input. Hence some analysis of the problem is required prior to performing an implicitization. In the curve case, the degree can be chosen to be the parametric degree, but in the surface case this is not recommended, as even simple bicubic patches have algebraic degree 18 in general. The Linz algorithm also requires a choice for the number of points to be sampled, which can partly be set by an additional heuristic. Summing up, the answer is 'maybe' in the surface case, since all algorithms need some additional input parameters, which have to be provided by the user; in the curve case, the answer is definitely 'yes'.

• Is the algorithm able to find good low degree approximations? The error columns of table 4 clearly show that the approximation quality improves with increasing degree. For all methods the question of the optimal choice of the degree is still open. Clearly a tradeoff between accuracy and speed has to be made. This choice will be analyzed in the future.
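The degree issue raised here can be made concrete: a generic bi-degree (m, n) tensor-product surface has an exact implicit representation of degree 2mn, which yields 12 for bi-degree (3, 2) (cf. table 1, before symmetries lower it) and 18 for a bicubic patch. A one-line helper, ours for illustration:

```python
def generic_implicit_degree(m, n):
    """Exact implicit degree 2*m*n of a generic tensor-product surface
    of bi-degree (m, n); symmetries or base points may lower it."""
    return 2 * m * n

print(generic_implicit_degree(3, 2))  # 12, as for the Quartic before symmetry
print(generic_implicit_degree(3, 3))  # 18, a generic bicubic patch
```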
Furthermore, we plan to use piecewise algebraic methods, which lead towards real world applications. Here, additional degrees of freedom can be introduced by using higher degrees, but also by increasing the number of patches.

• Is it possible to use the algorithms for general, procedurally defined curves and surfaces? The answer is 'yes' for the Linz method, as it can be used to implicitize any representation by sampling a point cloud. The SINTEF and Maple methods are restricted to parametrically defined curves and surfaces.

§7. Conclusions

In this paper, we have compared three available techniques for approximate implicitization, using certain benchmarks and a number of qualitative criteria. It is planned to continue this comparison for piecewise algebraic approximations; due to the local influence of splines, these approximations will be more flexible.

Fig. 3. Results for surfaces - Linz algorithm

Fig. 4. Results for surfaces - SINTEF algorithm

As demonstrated by the results, there exist several techniques, each with its advantages and disadvantages. The most powerful technique is currently the SINTEF one, since it needs only a few parameters and is very fast. As a certain disadvantage, it cannot be used for general input curves and surfaces, and it requires a separate analysis of the eigenvectors to give solutions without additional branches. The Linz algorithm, which requires a number of additional parameters, can be applied to arbitrary input curves and surfaces. We expect that it will perform significantly better in the case of piecewise algebraic approximations: assuming the same number of coefficients as for a single patch approximation, the generation of the linear system will be faster for piecewise approximations, due to the compact support of the basis functions.

Summing up, we believe that the Linz and SINTEF methods may serve as the basis for exploring the potential application of approximate algebraic techniques. Still, in order to deal with real industrial problems, we will have to generalize our approach to piecewise algebraic methods.

Acknowledgements

This research is funded by the European Union through the project IST-2002-35512 (GAIA II), entitled 'Intersection algorithms for geometry based IT-applications using approximate algebraic methods'. We thank our project partner think3 for providing the data.


§8. References

1. R. M. Corless, M. W. Giesbrecht, I. S. Kotsireas, S. M. Watt, Numerical implicitization of parametric hypersurfaces with linear algebra, AISC 2000 Proceedings, Madrid, Spain, LNAI 1930.
2. D. Cox, J. Little, D. O'Shea, Ideals, Varieties, and Algorithms, Springer Verlag, New York, 1992 & 1997.
3. D. Cox, J. Little, D. O'Shea, Using Algebraic Geometry, Springer Verlag, New York, 1998.
4. D. Cox, R. Goldman, M. Zhang, On the validity of implicitization by moving quadrics for rational surfaces with no base points, J. Symbolic Computation, 11, 1999.
5. T. Dokken, Approximate implicitization, Mathematical Methods in CAGD 2000, Tom Lyche and Larry L. Schumaker (eds.), Vanderbilt University Press, Nashville, 2001.
6. C. M. Hoffmann, Implicit curves and surfaces in CAGD, Comp. Graphics and Applics., 13, 79-88, 1993.
7. B. Jüttler, A. Felis, Least-squares fitting of algebraic spline surfaces, Advances in Computational Mathematics, 17 (1-2), 135-152, 2002.
8. R. R. Patterson, Using duality to implicitize and find cusps and inflection points of Bézier curves, Computer Aided Geometric Design, 19 (6), 433-444, 2002.
9. T. W. Sederberg, F. Chen, Implicitization using moving curves and surfaces, Computer Graphics, Proceedings, Annual Conference Series, 29, 301-308, 1995.
10. E. Wurm, B. Jüttler, Approximate implicitization via curve fitting, Proceedings of the Eurographics/ACM SIGGRAPH Symposium on Geometry Processing, Aachen, Germany, Eurographics Association, 2003.
11. http://new.haveland.com/povbench/graph.php

Elmar Wurm, Bert Jüttler
Johannes Kepler University, Linz, Austria
http://www.ag.jku.at

Jan B. Thomassen, Tor Dokken
SINTEF, Oslo, Norway
http://www.math.sintef.no