gcc: CLOCK_REALTIME undeclared


I was trying to run the C code that I found on this website

#include <stdlib.h>
#include <stdio.h>
#include <time.h>

#define n 2048

double A[n][n];
double B[n][n];
double C[n][n];

int main() {

    //populate the matrices with random values between 0.0 and 1.0
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {

            A[i][j] = (double) rand() / (double) RAND_MAX;
            B[i][j] = (double) rand() / (double) RAND_MAX;
            C[i][j] = 0;
        }
    }

    struct timespec start, end;
    double time_spent;

    //matrix multiplication
    clock_gettime(CLOCK_REALTIME, &start);
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {
            for (int k = 0; k < n; k++) {
                C[i][j] += A[i][k] * B[k][j];
            }
        }
    }
    clock_gettime(CLOCK_REALTIME, &end);
    time_spent = (end.tv_sec - start.tv_sec) + (end.tv_nsec - start.tv_nsec) / 1000000000.0;
    printf("Elapsed time in seconds: %f \n", time_spent);
    return 0;
}

But when I compile it, gcc says:

main.c:27:19: error: 'CLOCK_REALTIME' undeclared (first use in this function)
 clock_gettime(CLOCK_REALTIME, &start);
               ^~~~~~~~~~~~~~

I used gcc-g++ from MinGW as described in this tutorial.

I just copied the C code from the tutorial page and compiled it using

gcc -O3 main.c -o matrix

(my source file is named main.c).

Possible important information: I am on Windows 10.

EDIT: Compilation works just fine on Ubuntu 20.04 (as described in the article). However, can you help me get it to compile on Windows?

Comments (1)

风吹过旳痕迹 2025-01-17 14:19:22

Here's what I've got for a benchmark program I'm working on:

#include <stdio.h>
#include <stdlib.h>

// OS_Windows is this program's own flag, not a compiler builtin; if the
// build doesn't define it, derive it from _WIN32 (predefined by MinGW
// and MSVC).
#ifndef OS_Windows
#ifdef _WIN32
#define OS_Windows 1
#else
#define OS_Windows 0
#endif
#endif

#if OS_Windows
#include <windows.h>    // QueryPerformanceFrequency / QueryPerformanceCounter
#else
#include <time.h>       // clock_gettime
#endif

typedef long long ticks_t;

static ticks_t ticks_per_second;

#if OS_Windows
static ticks_t get_timer_resolution()
{
    LARGE_INTEGER freq;
    if (!QueryPerformanceFrequency(&freq)
            || !freq.QuadPart) {
        printf("Error: cannot get Windows timer resolution.\n");
        exit(123);
    }
    return freq.QuadPart;
}

inline static ticks_t get_ticks()
{
    LARGE_INTEGER ticks;
    if (!QueryPerformanceCounter(&ticks)) {
        printf("Error: cannot get Windows timer count.\n");
        exit(123);
    }
    return ticks.QuadPart;
}
#else       // NOT WINDOWS, should be Linux / Unix
static ticks_t get_timer_resolution()
{
    return 10000000;    // ticks here are 100 ns units, so 10^7 per second
}

inline static ticks_t get_ticks()
{
    struct timespec t;
    clock_gettime(CLOCK_REALTIME, &t);
    // convert to 100 ns ticks, rounding tv_nsec to the nearest 100 ns
    return (long long)t.tv_sec * 10000000 + (t.tv_nsec + 50) / 100;
}

#endif

[...]

int main() {
    [...]
    ticks_per_second = get_timer_resolution();

Then use get_ticks() just before and after what I need to time, take the difference, and scale as needed using ticks_per_second, e.g.

    nticks = nticks * 1000000 / ticks_per_second;   // convert to microsecs
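
For instance, a minimal usage sketch with the pieces above (do_work() is a hypothetical stand-in for whatever is being measured):

    ticks_t t0 = get_ticks();
    do_work();                              // hypothetical code under test
    ticks_t nticks = get_ticks() - t0;      // elapsed ticks
    printf("Elapsed: %lld microseconds\n", nticks * 1000000 / ticks_per_second);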