This project mainly draws on Google Test (gtest) and a Zhihu article, "qtest: a from-scratch implementation of a unit-testing library", together with the code provided by its author, especially the macro parts. I have never liked macros and never really mastered their many uses, but a test framework in this style cannot avoid them. On top of that code I reorganized and refactored things, and extended and polished a number of details.

This post is a small exercise from my C++ studies; if anything is missing or wrong, corrections are welcome.

Introduction to MTest

MTest is a simple header-only framework consisting of just two headers:

  • mtest.hpp implements the two classes MTest and MTest::MTestMessage and contains no macros at all.
  • mtest_macro.hpp provides the public macros; all of them can be disabled at compile time with -DUNUSE_MTEST to avoid conflicts with gtest.

A test file needs to include both headers; the include order does not matter. The two could even be merged into a single file, but I prefer to keep macros at arm's length, so they live in a separate header. The main functionality is implemented in mtest.hpp, which contains no macros.

Why reinvent the wheel?

  • I want MTest to serve as a simple stand-in for gtest, above all because it is header-only: there is no library to compile or link, it is very lightweight to use, and the implementation is completely transparent. mtest.hpp is only a little over five hundred lines and reasonably readable.
  • Implementing MTest is also a chance to learn and improve:
    • It is an opportunity to learn how gtest is used and to reproduce a small subset of its functionality with concise syntax.
    • It exposes blind spots in the language. For example, MTest relies on global static variables being initialized before main to register the test functions automatically, but clang-tidy's checks then require that registration never throw (constructing a std::string can throw, so it must be avoided there); see the sketch after this list.
    • In the filter implementation, string matching is decided with dynamic programming; testing revealed a small bug in the reference code found online, which was fixed and refined.
    • With correctness ensured, the code was polished again and again toward something cleaner, nicer, and more readable.
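
The point about automatic registration deserves a small illustration. The following is a minimal sketch of the idea only, not MTest's actual code (MTest's real mechanism is the TEST macro and MTest::add_test_item in the listings below); the names registry, register_test, and my_test are invented for this example:

#include <cstdio>
#include <cstdlib>
#include <vector>

// Hypothetical registry: a function-local static avoids the static
// initialization order problem across translation units.
static std::vector<void (*)()> &registry() noexcept {
    static std::vector<void (*)()> tests;
    return tests;
}

// Runs during static initialization, before main(), so it is noexcept:
// anything that might throw is caught and turned into a hard exit.
static int register_test(void (*f)(), const char *name) noexcept {
    try {
        registry().push_back(f);
    }
    catch (...) {
        std::fprintf(stderr, "failed to register %s\n", name);
        std::exit(1);
    }
    return 0;
}

// A TEST-style macro would generate a function plus a global whose
// initializer performs the registration as a side effect.
static void my_test() { /* assertions would go here */ }
[[maybe_unused]] static const int my_test_mark = register_test(my_test, "my_test");

int main() {
    for (auto *f : registry()) f(); // run everything that registered itself
}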

What has been done

Completed so far:

  • The basic EXPECT_XX and ASSERT_XX macros;
  • The TEST macro (gtest's TEST_F macro and other advanced usages are not supported);
  • Filtering of tests; only simple patterns such as c*.1?.2 are supported, and only a single filter at a time;
  • Output format and content essentially the same as gtest's;
  • The following command-line flags (example invocations are sketched right after this list):
    • --mtest_filter=XXX sets the filter;
    • --mtest_list_tests lists all tests (matching the current filter) without running them;
    • --mtest_disable_color disables colored output, which is on by default;
    • --mtest_brief enables brief output mode.
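
For reference, invocations of a test binary built with MTest might look like the following; the binary name ./my_tests is just a placeholder:

./my_tests --mtest_filter=FactorialTest.*
./my_tests --mtest_list_tests
./my_tests --mtest_brief --mtest_disable_color

Flags can be combined, and an unknown flag is ignored with a note (see set_from_argv in mtest.hpp).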

Possible future improvements:

  • Implementing the TEST_F macro;
  • The current filter support is fairly basic; the filter rules could be extended to match gtest's;
  • Using template metaprogramming to further enrich the framework.

Following gtest, MTest provides the main function in either of two ways.

The first way is to add the macro MTEST_MAIN at a suitable place in the test file (take care not to end up with two definitions of main); the macro expands to

int main(int argc, char *argv[]) {
    MTest::init_mtest(argc, argv, __FILE__);
    return MTest::run_all_tests();
}

Of course, when UNUSE_MTEST is defined, MTEST_MAIN expands to nothing (and MTest's other macros are simply not defined), so no conflict with gtest arises.

The second way imitates gtest_main: create the file below, compile it into a static library, and link that library in when building the test files; a possible build command is sketched after the file.

mtest_main.cpp
#include "mtest.hpp"
#include "mtest_macro.hpp"

MTEST_MAIN
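
For reference, one way to build and use such a library might be the following; the compiler and the file name my_tests.cpp are placeholders:

g++ -c mtest_main.cpp -o mtest_main.o
ar rcs libmtest_main.a mtest_main.o
g++ my_tests.cpp -L. -lmtest_main -o my_tests

The test source (my_tests.cpp here) still includes mtest.hpp and mtest_macro.hpp itself; only the MTEST_MAIN line moves into the library.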

Switching from MTest to gtest

MTest stays as compatible as possible with the corresponding subset of gtest: every macro and feature MTest implements keeps the same syntax as gtest, so you can switch from MTest to gtest without modifying the test source files.

Switching from MTest back to gtest takes only a few compiler options:

  • -DUNUSE_MTEST disables the macros provided by MTest, all of which are defined in mtest_macro.hpp;
  • because the test source files do not include the header gtest needs, the -include compiler option is used to force-include it, for example
-I../../external/googletest -include gtest/gtest.h
  • because the test source files contain no main function (MTest generates a fixed main through a macro), switching to gtest requires linking both the gtest and gtest_main libraries, for example
-L../../external/googletest/lib -lgtest -lgtest_main

The paths in these compiler options depend on where gtest actually lives; a complete invocation is sketched below.
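
Putting these together, a complete compile command might look like the following sketch; the compiler, file names, and paths are illustrative only, and -pthread is the flag typically needed when linking gtest on Linux:

g++ sample1.cc sample1_test.cpp -DUNUSE_MTEST \
    -I../../external/googletest -include gtest/gtest.h \
    -L../../external/googletest/lib -lgtest -lgtest_main -pthread -o sample1_test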

Examples

Example 1

Here we use the sample1 that ships with gtest for the demonstration, omitting the comments in the file. sample1 contains two functions: one computes the factorial, the other checks whether a number is prime.

sample1.cc
int Factorial(int n) {
    int result = 1;
    for (int i = 1; i <= n; i++) {
        result *= i;
    }

    return result;
}

// Returns true if and only if n is a prime number.
bool IsPrime(int n) {
    if (n <= 1) return false;
    if (n % 2 == 0) return n == 2;

    for (int i = 3;; i += 2) {
        if (i > n / i) break;
        if (n % i == 0) return false;
    }
    return true;
}

The test file is shown below. It exercises the two functions in two test suites, FactorialTest and IsPrimeTest, each containing three tests.

// include ...

namespace {

// Tests factorial of negative numbers.
TEST(FactorialTest, Negative) {
    EXPECT_EQ(1, Factorial(-5));
    EXPECT_EQ(1, Factorial(-1));
    EXPECT_GT(Factorial(-10), 0);
}

// Tests factorial of 0.
TEST(FactorialTest, Zero) { EXPECT_EQ(1, Factorial(0)); }

// Tests factorial of positive numbers.
TEST(FactorialTest, Positive) {
    EXPECT_EQ(1, Factorial(1));
    EXPECT_EQ(2, Factorial(2));
    EXPECT_EQ(6, Factorial(3));
    EXPECT_EQ(40320, Factorial(8));
}

// Tests negative input.
TEST(IsPrimeTest, Negative) {
    // This test belongs to the IsPrimeTest test case.

    EXPECT_FALSE(IsPrime(-1));
    EXPECT_FALSE(IsPrime(-2));
    EXPECT_FALSE(IsPrime(INT_MIN));
}

// Tests some trivial cases.
TEST(IsPrimeTest, Trivial) {
    EXPECT_FALSE(IsPrime(0));
    EXPECT_FALSE(IsPrime(1));
    EXPECT_TRUE(IsPrime(2));
    EXPECT_TRUE(IsPrime(3));
}

// Tests positive input.
TEST(IsPrimeTest, Positive) {
    EXPECT_FALSE(IsPrime(4));
    EXPECT_TRUE(IsPrime(5));
    EXPECT_FALSE(IsPrime(6));
    EXPECT_TRUE(IsPrime(23));
}
}  // namespace

MTEST_MAIN

The result of compiling and running both is shown in the figure below. gtest does not use colored output here (the green comes from PowerShell itself); MTest additionally prints a logo, and otherwise the content and format of the output are essentially the same.

Example 2

Since sample1 contains only passing test cases, here is another example that includes failing tests; the test file is as follows

#include "mtest.hpp"
#include "mtest_macro.hpp"

namespace {

TEST(Equal, 1) { EXPECT_EQ(1, 1) << "is 1==1 ?"; }

TEST(Equal, 2) { EXPECT_EQ(2, 3) << "is 2==3 ?"; }

TEST(NotEqual, 1) { EXPECT_NE(2, 3) << "is 2!=3 ?"; }

TEST(IsTrue, 1) { EXPECT_TRUE(2 < 3); }

TEST(isTrue, 2) { EXPECT_TRUE(2 > 4); }

} // namespace

MTEST_MAIN

Compiling with MTest and with gtest gives the results below; here MTest runs with colored output, which is on by default (it can be turned off with --mtest_disable_color).

Next we enable brief output mode for both, with the --mtest_brief and --gtest_brief options respectively, so that only failing tests are reported.

Finally we apply a filter to both, with the --mtest_filter=*.1 and --gtest_filter=*.1 options respectively; after filtering, all remaining tests pass.

Source code

The source consists of the two files mtest.hpp and mtest_macro.hpp.

mtest.hpp
#ifndef MTEST_H_
#define MTEST_H_

#include <cstdio>
#include <cstdlib>
#include <cstring>
#include <ctime>
#include <iostream>
#include <map>
#include <string>
#include <vector>

// MTest
class MTest {
public:
using FuncType = void(int *);

// test item
struct TestItem {
bool match_filter{false};
bool success{true}; // test result
FuncType *f;
const char *itemname;
const char *fullname;

TestItem(FuncType *arg_f, const char *arg_itemname,
const char *arg_fullname)
: f(arg_f), itemname(arg_itemname), fullname(arg_fullname) {}
};

// test set
using TestSet = std::vector<TestItem>;

class MTestMessage {
public:
MTestMessage(bool always_output, bool important_output)
: m_always_output(always_output),
m_important_output(important_output) {}

template <typename T>
MTestMessage &operator<<(T info) {
if (m_important_output || !brief_output()) {
if (m_always_output) {
std::cout << info;
return *this;
}

if (!expect_result()) {
std::cout << info << '\n';
return *this;
}
}
return *this;
}

template <typename T>
static void evaluate_if_required(const char *str, T value) {
if (std::to_string(value) != str) {
std::cout << " Which is: " << value << '\n';
}
}

// record the last EXPECT_XX result.
static bool &expect_result() noexcept {
static bool bool_expect_result{true};
return bool_expect_result;
}

// True: only print test failures.
// False: print more information.(default)
static bool &brief_output() noexcept {
static bool bool_expect_result{false};
return bool_expect_result;
}

private:
const bool m_always_output; // output even succeed last time.
const bool m_important_output; // output even in brief mode.
};

static MTestMessage &msg() {
static MTestMessage msg(true, false);
return msg;
}

static MTestMessage &msg_even_brief() {
static MTestMessage msg(true, true);
return msg;
}

static MTestMessage &msg_when_fail_even_brief() {
static MTestMessage msg(false, true);
return msg;
}

static void init_mtest(int argc, char *argv[], const char *main_file) {
get_instance().set_main_file(main_file);
get_instance().set_from_argv(argc, argv);
}

static void init_mtest(int argc, char *argv[]) {
get_instance().set_from_argv(argc, argv);
}

static void init_mtest(const char *main_file) {
get_instance().set_main_file(main_file);
}

// do nothing
static void init_mtest() {}

static int add_test_item(const char *test_set_name,
const char *test_itemname, FuncType *f,
const char *test_fullname) noexcept {
get_instance().set_test_item(test_set_name, test_itemname, f,
test_fullname);
return 0;
}

static void set_filter(std::string filter, bool force_modify) {
if (force_modify || get_instance().m_filter.empty())
get_instance().set_filter_force(filter);
}

// run all tests.
// return the number of failed test items.
static int run_all_tests() { return get_instance().run(); }

private:
static MTest &get_instance() noexcept {
static MTest entity;
return entity;
}

// run all tests.
int run() {
set_filter("*", false);
set_matched_count();

show_note_main();
show_note_filter();

show_start();

clock_t start_time_all{0};
clock_t start_time_test_set{0};
clock_t start_time_test_item{0};
clock_t end_time_all{0};
clock_t end_time_test_set{0};
clock_t end_time_test_item{0};

start_time_all = clock();
// foreach test set
for (auto it = m_test_sets.begin(); it != m_test_sets.end(); it++) {
std::string test_set_name = it->first;
TestSet &test_set = it->second;

int count = 0;
for (size_t i = 0; i < test_set.size(); i++) {
if (test_set[i].match_filter) count++;
}
if (count == 0) continue;

info_one() << make_proper_str(count, "test", false) << " from "
<< test_set_name << '\n';

start_time_test_set = clock();
// foreach test item
for (size_t i = 0; i < test_set.size(); i++) {
if (!test_set[i].match_filter) continue;

const char *fullname = test_set[i].fullname;
info_run() << fullname << '\n';

int tmp_fail_count = 0;
start_time_test_item = clock();
test_set[i].f(&tmp_fail_count); // call f
end_time_test_item = clock();
test_set[i].success = (tmp_fail_count == 0);

if (test_set[i].success) {
info_ok() << fullname << " ("
<< end_time_test_item - start_time_test_item
<< " ms)\n";
}
else {
info_failed() << fullname << " ("
<< end_time_test_item - start_time_test_item
<< " ms)\n";

++m_test_fail_count;
}
} // foreach test item
end_time_test_set = clock();

info_one() << make_proper_str(count, "test", false) << " from "
<< test_set_name << " ("
<< end_time_test_set - start_time_test_set
<< " ms total)\n\n";
} // foreach test set
end_time_all = clock();
m_cost_time_all = end_time_all - start_time_all;

show_result();
return m_test_fail_count;
}

void list_and_exit() {
set_filter("*", false);
set_matched_count();

show_note_main();

for (auto it = m_test_sets.begin(); it != m_test_sets.end(); it++) {
msg_even_brief() << it->first << ".\n";

const TestSet &test_set = it->second;
for (size_t i = 0; i < test_set.size(); i++) {
if (test_set[i].match_filter)
msg_even_brief() << " " << test_set[i].itemname << '\n';
}
}

exit(0);
}

void help_and_exit() const {
show_note_main();
show_logo();

msg_even_brief() << "[USAGE]\n";
msg_even_brief()
<< "This program contains tests written using MTest.\n"
<< "MTest is a gtest-style simple test framework by "
"fenglielie@qq.com.\n"
<< "You can use the following command line flags to control "
"its behavior:\n\n";
msg_even_brief()
<< "1. --mtest_filter=POSITIVE_PATTERN\n"
<< " Run only the tests whose name matches the positive "
"pattern.\n"
<< " '?' matches any single character; '*' matches any "
"substring(or empty).\n";
msg_even_brief()
<< "2. --mtest_list_tests\n"
<< " List the names of all tests without running them.\n"
<< "The name of TEST(Foo, Bar) is \"Foo.Bar\".\n";
msg_even_brief() << "3. --mtest_disable_color\n"
<< " Disable colorful output.\n";
msg_even_brief() << "4. --mtest_brief\n"
<< " Enable brief output.\n";

msg_even_brief() << '\n';
exit(0);
}

void show_note_main() const {
if (!m_main_file.empty() && m_main_file != "main.cpp") {
msg_even_brief()
<< "Note: Running main() from " << m_main_file << '\n';
}
}

void show_note_filter() const {
if (m_filter != "*")
msg_even_brief() << "Note: MTest filter = " << m_filter << '\n';
}

void show_start() const {
show_logo();

info_two() << "Running "
<< make_proper_str(m_matched_count_items, "test", false)
<< " from "
<< make_proper_str(m_matched_count_sets, "test suite", false)
<< ".\n";
info_one() << "Global test environment set-up.\n";
}

void show_result() const {
info_one() << "Global test environment tear-down\n";
info_two() << make_proper_str(m_matched_count_items, "test", false)
<< " from "
<< make_proper_str(m_matched_count_sets, "test suite", false)
<< " ran. (" << m_cost_time_all << " ms total)\n";

info_passed() << make_proper_str(
m_matched_count_items - m_test_fail_count, "test", false)
<< ".\n";

// list all failed tests
if (m_test_fail_count != 0 && !MTestMessage::brief_output()) {
info_failed() << make_proper_str(m_test_fail_count, "test", false)
<< ", listed below:\n";

for (auto it = m_test_sets.begin(); it != m_test_sets.end(); it++) {
const TestSet &test_set = it->second;
for (size_t i = 0; i < test_set.size(); i++) {
if (!test_set[i].success) {
info_failed() << test_set[i].fullname << '\n';
}
}
}
if (m_use_color) {
msg_even_brief()
<< "\n " << m_color_red
<< make_proper_str(m_test_fail_count, "FAILED TEST", true)
<< m_color_end << '\n';
}
else {
msg_even_brief()
<< "\n "
<< make_proper_str(m_test_fail_count, "FAILED TEST", true)
<< '\n';
}
}
}

void show_logo() const {
if (m_use_color) {
msg() << m_color_green << "\
__ __ _____ _____ ____ _____ \n\
| \\/ |_ _| ____/ ___|_ _|\n\
| |\\/| | | | | _| \\___ \\ | |\n\
| | | | | | | |___ ___) || |\n\
|_| |_| |_| |_____|____/ |_|\n"
<< m_color_end << '\n';
}
else {
msg() << "\
__ __ _____ _____ ____ _____ \n\
| \\/ |_ _| ____/ ___|_ _|\n\
| |\\/| | | | | _| \\___ \\ | |\n\
| | | | | | | |___ ___) || |\n\
|_| |_| |_| |_____|____/ |_|\n"
<< '\n';
}
}

MTestMessage &info_passed() const {
if (m_use_color) {
msg_even_brief() << m_color_green << "[ PASSED ] " << m_color_end;
}
else { msg_even_brief() << "[ PASSED ] "; }
return msg_even_brief();
}

MTestMessage &info_failed() const {
if (m_use_color) {
msg_even_brief() << m_color_red << "[ FAILED ] " << m_color_end;
}
else { msg_even_brief() << "[ FAILED ] "; }
return msg_even_brief();
}

MTestMessage &info_ok() const {
if (m_use_color) {
msg() << m_color_green << "[ OK ] " << m_color_end;
}
else { msg() << "[ OK ] "; }
return msg();
}

MTestMessage &info_two() const {
if (m_use_color) {
msg_even_brief() << m_color_green << "[==========] " << m_color_end;
}
else { msg_even_brief() << "[==========] "; }
return msg_even_brief();
}

MTestMessage &info_one() const {
if (m_use_color) {
msg() << m_color_green << "[----------] " << m_color_end;
}
else { msg() << "[----------] "; }
return msg();
}

MTestMessage &info_run() const {
if (m_use_color) {
msg() << m_color_green << "[ RUN ] " << m_color_end;
}
else { msg() << "[ RUN ] "; }
return msg();
}

void set_test_item(const char *test_set_name, const char *test_itemname,
FuncType *f, const char *test_fullname) noexcept {
try {
TestItem item(f, test_itemname, test_fullname);
m_test_sets[test_set_name].emplace_back(item);
}
catch (...) {
std::cerr << "Failed to add test item: " << test_fullname << '\n';
exit(1);
}
}

void set_filter_force(std::string filter) { m_filter = filter; }

void set_main_file(const char *main_file) { m_main_file = main_file; }

void set_from_argv(int argc, char *argv[]) {
if (argc == 1) return;

const std::string filter_prefix = "--mtest_filter=";
const std::string list_tests_option = "--mtest_list_tests";
const std::string disable_color_option = "--mtest_disable_color";
const std::string help_option = "--help";
const std::string brief_option = "--mtest_brief";

std::string arg_str;
bool is_filter_override = false;

for (int i = 1; i < argc; i++) {
arg_str = argv[i];

if (starts_with(arg_str, filter_prefix)) { // filter
if (!m_filter.empty()) is_filter_override = true;
arg_str.erase(0, filter_prefix.length());

set_filter(arg_str, true); // force update
set_matched_count();
}
else if (arg_str == disable_color_option)
m_use_color = false; // colored output is on by default; this flag turns it off
else if (arg_str == brief_option)
MTestMessage::brief_output() = true; // default: false
else if (arg_str == list_tests_option)
list_and_exit();
else if (arg_str == help_option)
help_and_exit();
else {
msg_even_brief() << "Note: Unknown flag(" << arg_str
<< ") will be ignored.\n";
}
}

if (is_filter_override)
msg_even_brief()
<< "Note: MTest filter will be override by the last one.";
}

// update matched_count_items and matched_count_sets.
// update test_item's match_filter.
void set_matched_count() {
m_matched_count_items = 0;
m_matched_count_sets = 0;

for (auto it = m_test_sets.begin(); it != m_test_sets.end(); it++) {
TestSet &test_set = it->second;

int count = 0;
for (size_t i = 0; i < test_set.size(); i++) {
std::string fullname = test_set[i].fullname;
if (str_match(fullname, m_filter)) {
++count;
test_set[i].match_filter = true;
++m_matched_count_items;
}
else
test_set[i].match_filter = false;
}

if (count > 0) ++m_matched_count_sets;
}
}

static bool starts_with(const std::string &str, const std::string &prefix) {
return prefix.size() <= str.size()
&& std::equal(prefix.cbegin(), prefix.cend(), str.cbegin());
}

static std::string make_proper_str(int num, const std::string &str,
bool uppercase) {
std::string res;
if (num > 1) {
if (uppercase)
res = std::to_string(num) + " " + str + "S";
else
res = std::to_string(num) + " " + str + "s";
}
else { res = std::to_string(num) + " " + str; }
return res;
}

// string match (DP):
// '?' matches any single character; '*' matches any substring(or empty).
static bool str_match(std::string str, std::string pattern) {
if (pattern == "*") return true;

const size_t m = str.size();
const size_t n = pattern.size();
std::vector<bool> prev(m + 1, false);
std::vector<bool> curr(m + 1, false);

prev[0] = true;

for (size_t i = 1; i <= n; i++) {
bool flag = true;
for (size_t ii = 0; ii < i; ii++) {
if (pattern[ii] != '*') {
flag = false;
break;
}
}
curr[0] = flag;

for (size_t j = 1; j <= m; j++) {
if (pattern[i - 1] == '*') {
curr[j] = (curr[j - 1] || prev[j]);
}
else if ((pattern[i - 1] == '?')
|| (str[j - 1] == pattern[i - 1])) {
curr[j] = prev[j - 1];
}
else { curr[j] = false; }
}
prev = curr;
}

return prev[m];
}

// private data
//----------------------------------------------------------------------------//

const char *m_color_red = "\x1b[91m";
const char *m_color_green = "\x1b[92m";
const char *m_color_end = "\x1b[0m";

std::map<const char *, TestSet> m_test_sets;
int m_test_fail_count{0}; // Number of failed test items.
clock_t m_cost_time_all{0};

std::string m_filter;
std::string m_main_file; // The file name of main().
bool m_use_color{true}; // Use colored output.

int m_matched_count_items{0}; // Number of test items matched the filter.
int m_matched_count_sets{0}; // Number of test sets matched the filter.
};

#endif // MTEST_H_
mtest_macro.hpp
#ifndef MTEST_MACRO_H_
#define MTEST_MACRO_H_

#ifndef UNUSE_MTEST

// EXPECT
#define MTEST_EXPECT(x, y, cond) \
if (!((x)cond(y))) { \
MTest::MTestMessage::expect_result() = false; \
MTest::msg_even_brief() \
<< __FILE__ << ":" << __LINE__ << ": Failure\n"; \
if (strcmp(#cond, "==") == 0) { \
MTest::msg_even_brief() << "Expected equality of these values:\n" \
<< " " << #x << '\n'; \
MTest::MTestMessage::evaluate_if_required(#x, x); \
MTest::msg_even_brief() << " " << #y << '\n'; \
MTest::MTestMessage::evaluate_if_required(#y, y); \
} \
else { \
MTest::msg_even_brief() \
<< "Expected: (" << #x << ") " << #cond << " (" << #y \
<< "), actual: " << std::to_string(x) << " vs " \
<< std::to_string(y) << '\n'; \
} \
*tmp_fail_count = *tmp_fail_count + 1; \
} \
else { MTest::MTestMessage::expect_result() = true; } \
MTest::msg_when_fail_even_brief()

// EXPECT_XX
#define EXPECT_EQ(x, y) MTEST_EXPECT(x, y, ==)
#define EXPECT_NE(x, y) MTEST_EXPECT(x, y, !=)
#define EXPECT_LT(x, y) MTEST_EXPECT(x, y, <)
#define EXPECT_LE(x, y) MTEST_EXPECT(x, y, <=)
#define EXPECT_GT(x, y) MTEST_EXPECT(x, y, >)
#define EXPECT_GE(x, y) MTEST_EXPECT(x, y, >=)

// EXPECT double almost equal
#define EXPECT_NEAR(x, y, precision) \
if (std::abs((x) - (y)) > (precision)) { \
MTest::MTestMessage::expect_result() = false; \
MTest::msg_even_brief() \
<< __FILE__ << ":" << __LINE__ << ": Failure\n"; \
MTest::msg_even_brief() \
<< "Expected: (" << #x << ") " \
<< " ~ " \
<< " (" << #y << "), actual: " << std::to_string(x) << " vs " \
<< std::to_string(y) << "(" << precision << ")\n"; \
*tmp_fail_count = *tmp_fail_count + 1; \
} \
else { MTest::MTestMessage::expect_result() = true; } \
MTest::msg_when_fail_even_brief()

// EXPECT bool
#define EXPECT_TRUE(x) \
if (!((x))) { \
MTest::MTestMessage::expect_result() = false; \
MTest::msg_even_brief() \
<< __FILE__ << ":" << __LINE__ << ": Failure\n" \
<< "Value of: " << #x << '\n' \
<< " Actual: false\n" \
<< "Expected: true\n"; \
*tmp_fail_count = *tmp_fail_count + 1; \
} \
else { MTest::MTestMessage::expect_result() = true; } \
MTest::msg_when_fail_even_brief()

#define EXPECT_FALSE(x) \
if (((x))) { \
MTest::MTestMessage::expect_result() = false; \
MTest::msg_even_brief() \
<< __FILE__ << ":" << __LINE__ << ": Failure\n" \
<< "Value of: " << #x << '\n' \
<< " Actual: true\n" \
<< "Expected: false\n"; \
*tmp_fail_count = *tmp_fail_count + 1; \
} \
else { MTest::MTestMessage::expect_result() = true; } \
MTest::msg_when_fail_even_brief()

// ASSERT
#define MTEST_ASSERT(x, y, cond) \
if (!((x)cond(y))) { \
MTest::msg_even_brief() \
<< __FILE__ << ":" << __LINE__ << ": Failure\n"; \
if (strcmp(#cond, "==") == 0) { \
MTest::msg_even_brief() << "Expected equality of these values:\n" \
<< " " << #x << '\n'; \
MTest::MTestMessage::evaluate_if_required(#x, x); \
MTest::msg_even_brief() << " " << #y << '\n'; \
MTest::MTestMessage::evaluate_if_required(#y, y); \
} \
else { \
MTest::msg_even_brief() \
<< "Expected: (" << #x << ") " << #cond << " (" << #y \
<< "), actual: " << std::to_string(x) << " vs " \
<< std::to_string(y) << '\n'; \
} \
*tmp_fail_count = *tmp_fail_count + 1; \
return; \
}

// ASSERT_XX
#define ASSERT_EQ(x, y) MTEST_ASSERT(x, y, ==)
#define ASSERT_NE(x, y) MTEST_ASSERT(x, y, !=)
#define ASSERT_LT(x, y) MTEST_ASSERT(x, y, <)
#define ASSERT_LE(x, y) MTEST_ASSERT(x, y, <=)
#define ASSERT_GT(x, y) MTEST_ASSERT(x, y, >)
#define ASSERT_GE(x, y) MTEST_ASSERT(x, y, >=)

// ASSERT bool
#define ASSERT_TRUE(x) \
if (!((x))) { \
MTest::msg_even_brief() \
<< __FILE__ << ":" << __LINE__ << ": Failure\n" \
<< "Value of: " << #x << '\n' \
<< " Actual: false\n" \
<< "Expected: true\n"; \
*tmp_fail_count = *tmp_fail_count + 1; \
return; \
}

#define ASSERT_FALSE(x) \
if (((x))) { \
MTest::msg_even_brief() \
<< __FILE__ << ":" << __LINE__ << ": Failure\n" \
<< "Value of: " << #x << '\n' \
<< " Actual: true\n" \
<< "Expected: false\n"; \
*tmp_fail_count = *tmp_fail_count + 1; \
return; \
}

// TEST
#define TEST(set, name) \
void mtest_donotuse_func_##set##_##name(int *tmp_fail_count); \
struct MtestMarkClass##set##name { \
private: \
const static inline int m_mtest_donotuse_mark_##set##_##name = \
MTest::add_test_item(#set, #name, \
mtest_donotuse_func_##set##_##name, \
#set "." #name); \
}; \
void mtest_donotuse_func_##set##_##name(int *tmp_fail_count)

#define RUN_ALL_TESTS MTest::run_all_tests

#define TEST_FILTER(filter_str) MTest::set_filter(filter_str)

#define MTEST_MAIN \
int main(int argc, char *argv[]) { \
MTest::init_mtest(argc, argv, __FILE__); \
return MTest::run_all_tests(); \
}

#else
#define MTEST_MAIN
#endif // UNUSE_MTEST

#endif // MTEST_MACRO_H_