Linux kernel mirror (for testing) git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git

locking/atomic: scripts: generate kerneldoc comments

Currently the atomics are documented in Documentation/atomic_t.txt, and
have no kerneldoc comments. There are a sufficient number of gotchas
(e.g. semantics, noinstr-safety) that it would be nice to have comments
to call these out, and it would be nice to have kerneldoc comments such
that these can be collated.

While it's possible to derive the semantics from the code, this can be
painful given the amount of indirection we currently have (e.g. fallback
paths), and it's easy to be misled by naming, e.g.

* The unconditional void-returning ops *only* have relaxed variants
without a _relaxed suffix, and can easily be mistaken for being fully
ordered.

It would be nice to give these a _relaxed() suffix, but this would
result in significant churn throughout the kernel.

* Our naming of conditional and unconditional+test ops is rather
inconsistent, and it can be difficult to derive the name of an
operation, or to identify whether an op is conditional or
unconditional+test.

Some ops are clearly conditional:
- dec_if_positive
- add_unless
- dec_unless_positive
- inc_unless_negative

Some ops are clearly unconditional+test:
- sub_and_test
- dec_and_test
- inc_and_test

However, what exactly those ops test is not obvious. A _test_zero suffix
might be clearer.

Others could be read ambiguously:
- inc_not_zero // conditional
- add_negative // unconditional+test

It would probably be worth renaming these, e.g. to inc_unless_zero and
add_test_negative.

As a step towards making this more consistent and easier to understand,
this patch adds kerneldoc comments for all generated *atomic*_*()
functions. These are generated from templates, with some common text
shared, making it easy to extend these in future if necessary.

I've tried to make these as consistent and clear as possible, and I've
deliberately ensured:

* All ops have their ordering explicitly mentioned in the short and long
description.

* All test ops have "test" in their short description.

* All ops are described as an expression using their usual C operator.
For example:

andnot: "Atomically updates @v to (@v & ~@i)"
inc: "Atomically updates @v to (@v + 1)"

Which may be clearer to non-native English speakers, and allows all
the operations to be described in the same style.

* All conditional ops have their condition described as an expression
using the usual C operators. For example:

add_unless: "If (@v != @u), atomically updates @v to (@v + @i)"
cmpxchg: "If (@v == @old), atomically updates @v to @new"

Which may be clearer to non-native English speakers, and allows all
the operations to be described in the same style.

* All bitwise ops (and,andnot,or,xor) explicitly mention that they are
bitwise in their short description, so that they are not mistaken for
performing their logical equivalents.

* The noinstr safety of each op is explicitly described, with a
description of whether or not to use the raw_ form of the op.

There should be no functional change as a result of this patch.

Reported-by: Paul E. McKenney <paulmck@kernel.org>
Signed-off-by: Mark Rutland <mark.rutland@arm.com>
Signed-off-by: Peter Zijlstra (Intel) <peterz@infradead.org>
Reviewed-by: Kees Cook <keescook@chromium.org>
Link: https://lore.kernel.org/r/20230605070124.3741859-26-mark.rutland@arm.com

Authored by Mark Rutland, committed by Peter Zijlstra
ad811070 8aaf297a

+5940 -7
include/linux/atomic/atomic-arch-fallback.h  +1847 -1
···
 #define raw_sync_cmpxchg arch_sync_cmpxchg

+/**
+ * raw_atomic_read() - atomic load with relaxed ordering
+ * @v: pointer to atomic_t
+ *
+ * Atomically loads the value of @v with relaxed ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic_read() elsewhere.
+ *
+ * Return: The value loaded from @v.
+ */
 static __always_inline int
 raw_atomic_read(const atomic_t *v)
 {
 	return arch_atomic_read(v);
 }

+/**
+ * raw_atomic_read_acquire() - atomic load with acquire ordering
+ * @v: pointer to atomic_t
+ *
+ * Atomically loads the value of @v with acquire ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic_read_acquire() elsewhere.
+ *
+ * Return: The value loaded from @v.
+ */
 static __always_inline int
 raw_atomic_read_acquire(const atomic_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic_set() - atomic set with relaxed ordering
+ * @v: pointer to atomic_t
+ * @i: int value to assign
+ *
+ * Atomically sets @v to @i with relaxed ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic_set() elsewhere.
+ *
+ * Return: Nothing.
+ */
 static __always_inline void
 raw_atomic_set(atomic_t *v, int i)
 {
 	arch_atomic_set(v, i);
 }

+/**
+ * raw_atomic_set_release() - atomic set with release ordering
+ * @v: pointer to atomic_t
+ * @i: int value to assign
+ *
+ * Atomically sets @v to @i with release ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic_set_release() elsewhere.
+ *
+ * Return: Nothing.
+ */
 static __always_inline void
 raw_atomic_set_release(atomic_t *v, int i)
 {
···
 #endif
 }

+/**
+ * raw_atomic_add() - atomic add with relaxed ordering
+ * @i: int value to add
+ * @v: pointer to atomic_t
+ *
+ * Atomically updates @v to (@v + @i) with relaxed ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic_add() elsewhere.
+ *
+ * Return: Nothing.
+ */
 static __always_inline void
 raw_atomic_add(int i, atomic_t *v)
 {
 	arch_atomic_add(i, v);
 }

+/**
+ * raw_atomic_add_return() - atomic add with full ordering
+ * @i: int value to add
+ * @v: pointer to atomic_t
+ *
+ * Atomically updates @v to (@v + @i) with full ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic_add_return() elsewhere.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline int
 raw_atomic_add_return(int i, atomic_t *v)
 {
···

[… analogous kerneldoc comments are added for the _acquire/_release/_relaxed
variants and for the sub, inc, dec, fetch_*, and, andnot, or, xor, and xchg
families …]

···
+/**
+ * raw_atomic_cmpxchg() - atomic compare and exchange with full ordering
+ * @v: pointer to atomic_t
+ * @old: int value to compare with
+ * @new: int value to assign
+ *
+ * If (@v == @old), atomically updates @v to @new with full ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic_cmpxchg() elsewhere.
+ *
+ * Return: The original value of @v.
1319 + */ 1992 1320 static __always_inline int 1993 1321 raw_atomic_cmpxchg(atomic_t *v, int old, int new) 1994 1322 { ··· 2017 1321 #endif 2018 1322 } 2019 1323 1324 + /** 1325 + * raw_atomic_cmpxchg_acquire() - atomic compare and exchange with acquire ordering 1326 + * @v: pointer to atomic_t 1327 + * @old: int value to compare with 1328 + * @new: int value to assign 1329 + * 1330 + * If (@v == @old), atomically updates @v to @new with acquire ordering. 1331 + * 1332 + * Safe to use in noinstr code; prefer atomic_cmpxchg_acquire() elsewhere. 1333 + * 1334 + * Return: The original value of @v. 1335 + */ 2020 1336 static __always_inline int 2021 1337 raw_atomic_cmpxchg_acquire(atomic_t *v, int old, int new) 2022 1338 { ··· 2045 1337 #endif 2046 1338 } 2047 1339 1340 + /** 1341 + * raw_atomic_cmpxchg_release() - atomic compare and exchange with release ordering 1342 + * @v: pointer to atomic_t 1343 + * @old: int value to compare with 1344 + * @new: int value to assign 1345 + * 1346 + * If (@v == @old), atomically updates @v to @new with release ordering. 1347 + * 1348 + * Safe to use in noinstr code; prefer atomic_cmpxchg_release() elsewhere. 1349 + * 1350 + * Return: The original value of @v. 1351 + */ 2048 1352 static __always_inline int 2049 1353 raw_atomic_cmpxchg_release(atomic_t *v, int old, int new) 2050 1354 { ··· 2072 1352 #endif 2073 1353 } 2074 1354 1355 + /** 1356 + * raw_atomic_cmpxchg_relaxed() - atomic compare and exchange with relaxed ordering 1357 + * @v: pointer to atomic_t 1358 + * @old: int value to compare with 1359 + * @new: int value to assign 1360 + * 1361 + * If (@v == @old), atomically updates @v to @new with relaxed ordering. 1362 + * 1363 + * Safe to use in noinstr code; prefer atomic_cmpxchg_relaxed() elsewhere. 1364 + * 1365 + * Return: The original value of @v. 
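As the cmpxchg() kerneldoc above notes, the op always returns the original value of @v, so success is detected by comparing that return value against @old. This contract can be sketched in userspace C11 — purely illustrative, and `cmpxchg_sketch` is an invented name, not a kernel function:

```c
#include <assert.h>
#include <stdatomic.h>

/* Illustrative userspace model of the cmpxchg() contract: it always
 * returns the value observed in *v; the exchange happened iff that
 * returned value equals 'old'. */
static inline int cmpxchg_sketch(atomic_int *v, int old, int new)
{
	int expected = old;

	/* On failure, C11 writes the observed value into 'expected';
	 * cmpxchg() reports that value rather than a success flag. */
	atomic_compare_exchange_strong(v, &expected, new);
	return expected;
}
```

A caller therefore writes `if (cmpxchg_sketch(&v, old, new) == old)` to learn whether the update happened — which is exactly the extra comparison that the try_cmpxchg() variants below fold into their boolean return.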
+ */
 static __always_inline int
 raw_atomic_cmpxchg_relaxed(atomic_t *v, int old, int new)
 {
···
 #endif
 }

+/**
+ * raw_atomic_try_cmpxchg() - atomic compare and exchange with full ordering
+ * @v: pointer to atomic_t
+ * @old: pointer to int value to compare with
+ * @new: int value to assign
+ *
+ * If (@v == @old), atomically updates @v to @new with full ordering.
+ * Otherwise, updates @old to the current value of @v.
+ *
+ * Safe to use in noinstr code; prefer atomic_try_cmpxchg() elsewhere.
+ *
+ * Return: @true if the exchange occurred, @false otherwise.
+ */
 static __always_inline bool
 raw_atomic_try_cmpxchg(atomic_t *v, int *old, int new)
 {
···
 #endif
 }

+/**
+ * raw_atomic_try_cmpxchg_acquire() - atomic compare and exchange with acquire ordering
+ * @v: pointer to atomic_t
+ * @old: pointer to int value to compare with
+ * @new: int value to assign
+ *
+ * If (@v == @old), atomically updates @v to @new with acquire ordering.
+ * Otherwise, updates @old to the current value of @v.
+ *
+ * Safe to use in noinstr code; prefer atomic_try_cmpxchg_acquire() elsewhere.
+ *
+ * Return: @true if the exchange occurred, @false otherwise.
+ */
 static __always_inline bool
 raw_atomic_try_cmpxchg_acquire(atomic_t *v, int *old, int new)
 {
···
 #endif
 }

+/**
+ * raw_atomic_try_cmpxchg_release() - atomic compare and exchange with release ordering
+ * @v: pointer to atomic_t
+ * @old: pointer to int value to compare with
+ * @new: int value to assign
+ *
+ * If (@v == @old), atomically updates @v to @new with release ordering.
+ * Otherwise, updates @old to the current value of @v.
+ *
+ * Safe to use in noinstr code; prefer atomic_try_cmpxchg_release() elsewhere.
+ *
+ * Return: @true if the exchange occurred, @false otherwise.
+ */
 static __always_inline bool
 raw_atomic_try_cmpxchg_release(atomic_t *v, int *old, int new)
 {
···
 #endif
 }

+/**
+ * raw_atomic_try_cmpxchg_relaxed() - atomic compare and exchange with relaxed ordering
+ * @v: pointer to atomic_t
+ * @old: pointer to int value to compare with
+ * @new: int value to assign
+ *
+ * If (@v == @old), atomically updates @v to @new with relaxed ordering.
+ * Otherwise, updates @old to the current value of @v.
+ *
+ * Safe to use in noinstr code; prefer atomic_try_cmpxchg_relaxed() elsewhere.
+ *
+ * Return: @true if the exchange occurred, @false otherwise.
+ */
 static __always_inline bool
 raw_atomic_try_cmpxchg_relaxed(atomic_t *v, int *old, int new)
 {
···
 #endif
 }

+/**
+ * raw_atomic_sub_and_test() - atomic subtract and test if zero with full ordering
+ * @i: int value to subtract
+ * @v: pointer to atomic_t
+ *
+ * Atomically updates @v to (@v - @i) with full ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic_sub_and_test() elsewhere.
+ *
+ * Return: @true if the resulting value of @v is zero, @false otherwise.
+ */
 static __always_inline bool
 raw_atomic_sub_and_test(int i, atomic_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic_dec_and_test() - atomic decrement and test if zero with full ordering
+ * @v: pointer to atomic_t
+ *
+ * Atomically updates @v to (@v - 1) with full ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic_dec_and_test() elsewhere.
+ *
+ * Return: @true if the resulting value of @v is zero, @false otherwise.
+ */
 static __always_inline bool
 raw_atomic_dec_and_test(atomic_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic_inc_and_test() - atomic increment and test if zero with full ordering
+ * @v: pointer to atomic_t
+ *
+ * Atomically updates @v to (@v + 1) with full ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic_inc_and_test() elsewhere.
+ *
+ * Return: @true if the resulting value of @v is zero, @false otherwise.
+ */
 static __always_inline bool
 raw_atomic_inc_and_test(atomic_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic_add_negative() - atomic add and test if negative with full ordering
+ * @i: int value to add
+ * @v: pointer to atomic_t
+ *
+ * Atomically updates @v to (@v + @i) with full ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic_add_negative() elsewhere.
+ *
+ * Return: @true if the resulting value of @v is negative, @false otherwise.
+ */
 static __always_inline bool
 raw_atomic_add_negative(int i, atomic_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic_add_negative_acquire() - atomic add and test if negative with acquire ordering
+ * @i: int value to add
+ * @v: pointer to atomic_t
+ *
+ * Atomically updates @v to (@v + @i) with acquire ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic_add_negative_acquire() elsewhere.
+ *
+ * Return: @true if the resulting value of @v is negative, @false otherwise.
+ */
 static __always_inline bool
 raw_atomic_add_negative_acquire(int i, atomic_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic_add_negative_release() - atomic add and test if negative with release ordering
+ * @i: int value to add
+ * @v: pointer to atomic_t
+ *
+ * Atomically updates @v to (@v + @i) with release ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic_add_negative_release() elsewhere.
+ *
+ * Return: @true if the resulting value of @v is negative, @false otherwise.
+ */
 static __always_inline bool
 raw_atomic_add_negative_release(int i, atomic_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic_add_negative_relaxed() - atomic add and test if negative with relaxed ordering
+ * @i: int value to add
+ * @v: pointer to atomic_t
+ *
+ * Atomically updates @v to (@v + @i) with relaxed ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic_add_negative_relaxed() elsewhere.
+ *
+ * Return: @true if the resulting value of @v is negative, @false otherwise.
+ */
 static __always_inline bool
 raw_atomic_add_negative_relaxed(int i, atomic_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic_fetch_add_unless() - atomic add unless value with full ordering
+ * @v: pointer to atomic_t
+ * @a: int value to add
+ * @u: int value to compare with
+ *
+ * If (@v != @u), atomically updates @v to (@v + @a) with full ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic_fetch_add_unless() elsewhere.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline int
 raw_atomic_fetch_add_unless(atomic_t *v, int a, int u)
 {
···
 #endif
 }

+/**
+ * raw_atomic_add_unless() - atomic add unless value with full ordering
+ * @v: pointer to atomic_t
+ * @a: int value to add
+ * @u: int value to compare with
+ *
+ * If (@v != @u), atomically updates @v to (@v + @a) with full ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic_add_unless() elsewhere.
+ *
+ * Return: @true if @v was updated, @false otherwise.
+ */
 static __always_inline bool
 raw_atomic_add_unless(atomic_t *v, int a, int u)
 {
···
 #endif
 }

+/**
+ * raw_atomic_inc_not_zero() - atomic increment unless zero with full ordering
+ * @v: pointer to atomic_t
+ *
+ * If (@v != 0), atomically updates @v to (@v + 1) with full ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic_inc_not_zero() elsewhere.
+ *
+ * Return: @true if @v was updated, @false otherwise.
+ */
 static __always_inline bool
 raw_atomic_inc_not_zero(atomic_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic_inc_unless_negative() - atomic increment unless negative with full ordering
+ * @v: pointer to atomic_t
+ *
+ * If (@v >= 0), atomically updates @v to (@v + 1) with full ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic_inc_unless_negative() elsewhere.
+ *
+ * Return: @true if @v was updated, @false otherwise.
+ */
 static __always_inline bool
 raw_atomic_inc_unless_negative(atomic_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic_dec_unless_positive() - atomic decrement unless positive with full ordering
+ * @v: pointer to atomic_t
+ *
+ * If (@v <= 0), atomically updates @v to (@v - 1) with full ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic_dec_unless_positive() elsewhere.
+ *
+ * Return: @true if @v was updated, @false otherwise.
+ */
 static __always_inline bool
 raw_atomic_dec_unless_positive(atomic_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic_dec_if_positive() - atomic decrement if positive with full ordering
+ * @v: pointer to atomic_t
+ *
+ * If (@v > 0), atomically updates @v to (@v - 1) with full ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic_dec_if_positive() elsewhere.
+ *
+ * Return: The old value of (@v - 1), regardless of whether @v was updated.
+ */
 static __always_inline int
 raw_atomic_dec_if_positive(atomic_t *v)
 {
···
 #include <asm-generic/atomic64.h>
 #endif

+/**
+ * raw_atomic64_read() - atomic load with relaxed ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically loads the value of @v with relaxed ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_read() elsewhere.
+ *
+ * Return: The value loaded from @v.
+ */
 static __always_inline s64
 raw_atomic64_read(const atomic64_t *v)
 {
 	return arch_atomic64_read(v);
 }

+/**
+ * raw_atomic64_read_acquire() - atomic load with acquire ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically loads the value of @v with acquire ordering.
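The conditional ops documented above (inc_not_zero(), dec_if_positive()) are the building blocks of refcount-style lifetime schemes: only take a reference while the count is non-zero, and learn from dec_if_positive()'s int return (the sign of old - 1) whether the decrement happened. A userspace C11 sketch of both, with invented `*_sketch` names:

```c
#include <assert.h>
#include <stdatomic.h>
#include <stdbool.h>

/* Illustrative userspace model of inc_not_zero(): only take a
 * reference while the count is non-zero (the object is live). */
static bool inc_not_zero_sketch(atomic_int *v)
{
	int old = atomic_load(v);

	do {
		if (old == 0)
			return false;
	} while (!atomic_compare_exchange_strong(v, &old, old + 1));

	return true;
}

/* Illustrative userspace model of dec_if_positive(): the caller
 * learns whether the decrement happened from the sign of the
 * returned (old - 1); a negative result means *v was not updated. */
static int dec_if_positive_sketch(atomic_int *v)
{
	int old = atomic_load(v);
	int new;

	do {
		new = old - 1;
		if (new < 0)
			break;	/* not updated; still report old - 1 */
	} while (!atomic_compare_exchange_strong(v, &old, new));

	return new;
}
```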
+ *
+ * Safe to use in noinstr code; prefer atomic64_read_acquire() elsewhere.
+ *
+ * Return: The value loaded from @v.
+ */
 static __always_inline s64
 raw_atomic64_read_acquire(const atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_set() - atomic set with relaxed ordering
+ * @v: pointer to atomic64_t
+ * @i: s64 value to assign
+ *
+ * Atomically sets @v to @i with relaxed ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_set() elsewhere.
+ *
+ * Return: Nothing.
+ */
 static __always_inline void
 raw_atomic64_set(atomic64_t *v, s64 i)
 {
 	arch_atomic64_set(v, i);
 }

+/**
+ * raw_atomic64_set_release() - atomic set with release ordering
+ * @v: pointer to atomic64_t
+ * @i: s64 value to assign
+ *
+ * Atomically sets @v to @i with release ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_set_release() elsewhere.
+ *
+ * Return: Nothing.
+ */
 static __always_inline void
 raw_atomic64_set_release(atomic64_t *v, s64 i)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_add() - atomic add with relaxed ordering
+ * @i: s64 value to add
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + @i) with relaxed ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_add() elsewhere.
+ *
+ * Return: Nothing.
+ */
 static __always_inline void
 raw_atomic64_add(s64 i, atomic64_t *v)
 {
 	arch_atomic64_add(i, v);
 }

+/**
+ * raw_atomic64_add_return() - atomic add with full ordering
+ * @i: s64 value to add
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + @i) with full ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_add_return() elsewhere.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 raw_atomic64_add_return(s64 i, atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_add_return_acquire() - atomic add with acquire ordering
+ * @i: s64 value to add
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + @i) with acquire ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_add_return_acquire() elsewhere.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 raw_atomic64_add_return_acquire(s64 i, atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_add_return_release() - atomic add with release ordering
+ * @i: s64 value to add
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + @i) with release ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_add_return_release() elsewhere.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 raw_atomic64_add_return_release(s64 i, atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_add_return_relaxed() - atomic add with relaxed ordering
+ * @i: s64 value to add
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + @i) with relaxed ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_add_return_relaxed() elsewhere.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 raw_atomic64_add_return_relaxed(s64 i, atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_fetch_add() - atomic add with full ordering
+ * @i: s64 value to add
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + @i) with full ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_fetch_add() elsewhere.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 raw_atomic64_fetch_add(s64 i, atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_fetch_add_acquire() - atomic add with acquire ordering
+ * @i: s64 value to add
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + @i) with acquire ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_fetch_add_acquire() elsewhere.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 raw_atomic64_fetch_add_acquire(s64 i, atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_fetch_add_release() - atomic add with release ordering
+ * @i: s64 value to add
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + @i) with release ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_fetch_add_release() elsewhere.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 raw_atomic64_fetch_add_release(s64 i, atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_fetch_add_relaxed() - atomic add with relaxed ordering
+ * @i: s64 value to add
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + @i) with relaxed ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_fetch_add_relaxed() elsewhere.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 raw_atomic64_fetch_add_relaxed(s64 i, atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_sub() - atomic subtract with relaxed ordering
+ * @i: s64 value to subtract
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - @i) with relaxed ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_sub() elsewhere.
+ *
+ * Return: Nothing.
+ */
 static __always_inline void
 raw_atomic64_sub(s64 i, atomic64_t *v)
 {
 	arch_atomic64_sub(i, v);
 }

+/**
+ * raw_atomic64_sub_return() - atomic subtract with full ordering
+ * @i: s64 value to subtract
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - @i) with full ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_sub_return() elsewhere.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 raw_atomic64_sub_return(s64 i, atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_sub_return_acquire() - atomic subtract with acquire ordering
+ * @i: s64 value to subtract
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - @i) with acquire ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_sub_return_acquire() elsewhere.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 raw_atomic64_sub_return_acquire(s64 i, atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_sub_return_release() - atomic subtract with release ordering
+ * @i: s64 value to subtract
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - @i) with release ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_sub_return_release() elsewhere.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 raw_atomic64_sub_return_release(s64 i, atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_sub_return_relaxed() - atomic subtract with relaxed ordering
+ * @i: s64 value to subtract
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - @i) with relaxed ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_sub_return_relaxed() elsewhere.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 raw_atomic64_sub_return_relaxed(s64 i, atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_fetch_sub() - atomic subtract with full ordering
+ * @i: s64 value to subtract
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - @i) with full ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_fetch_sub() elsewhere.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 raw_atomic64_fetch_sub(s64 i, atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_fetch_sub_acquire() - atomic subtract with acquire ordering
+ * @i: s64 value to subtract
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - @i) with acquire ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_fetch_sub_acquire() elsewhere.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 raw_atomic64_fetch_sub_acquire(s64 i, atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_fetch_sub_release() - atomic subtract with release ordering
+ * @i: s64 value to subtract
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - @i) with release ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_fetch_sub_release() elsewhere.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 raw_atomic64_fetch_sub_release(s64 i, atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_fetch_sub_relaxed() - atomic subtract with relaxed ordering
+ * @i: s64 value to subtract
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - @i) with relaxed ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_fetch_sub_relaxed() elsewhere.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 raw_atomic64_fetch_sub_relaxed(s64 i, atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_inc() - atomic increment with relaxed ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + 1) with relaxed ordering.
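The add/sub families above come in two result conventions: fetch_*() returns the value before the update, while *_return() returns the value after it. A userspace C11 contrast, with invented `*_sketch` names:

```c
#include <assert.h>
#include <stdatomic.h>

/* Illustrative userspace contrast of the two result conventions:
 * fetch_add() reports the value *before* the update... */
static long fetch_add_sketch(atomic_long *v, long i)
{
	return atomic_fetch_add(v, i);		/* original value of *v */
}

/* ...while add_return() reports the value *after* it. */
static long add_return_sketch(atomic_long *v, long i)
{
	return atomic_fetch_add(v, i) + i;	/* updated value of *v */
}
```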
+ *
+ * Safe to use in noinstr code; prefer atomic64_inc() elsewhere.
+ *
+ * Return: Nothing.
+ */
 static __always_inline void
 raw_atomic64_inc(atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_inc_return() - atomic increment with full ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + 1) with full ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_inc_return() elsewhere.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 raw_atomic64_inc_return(atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_inc_return_acquire() - atomic increment with acquire ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + 1) with acquire ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_inc_return_acquire() elsewhere.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 raw_atomic64_inc_return_acquire(atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_inc_return_release() - atomic increment with release ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + 1) with release ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_inc_return_release() elsewhere.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 raw_atomic64_inc_return_release(atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_inc_return_relaxed() - atomic increment with relaxed ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + 1) with relaxed ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_inc_return_relaxed() elsewhere.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 raw_atomic64_inc_return_relaxed(atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_fetch_inc() - atomic increment with full ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + 1) with full ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_fetch_inc() elsewhere.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 raw_atomic64_fetch_inc(atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_fetch_inc_acquire() - atomic increment with acquire ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + 1) with acquire ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_fetch_inc_acquire() elsewhere.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 raw_atomic64_fetch_inc_acquire(atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_fetch_inc_release() - atomic increment with release ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + 1) with release ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_fetch_inc_release() elsewhere.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 raw_atomic64_fetch_inc_release(atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_fetch_inc_relaxed() - atomic increment with relaxed ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + 1) with relaxed ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_fetch_inc_relaxed() elsewhere.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 raw_atomic64_fetch_inc_relaxed(atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_dec() - atomic decrement with relaxed ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - 1) with relaxed ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_dec() elsewhere.
+ *
+ * Return: Nothing.
+ */
 static __always_inline void
 raw_atomic64_dec(atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_dec_return() - atomic decrement with full ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - 1) with full ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_dec_return() elsewhere.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 raw_atomic64_dec_return(atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_dec_return_acquire() - atomic decrement with acquire ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - 1) with acquire ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_dec_return_acquire() elsewhere.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 raw_atomic64_dec_return_acquire(atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_dec_return_release() - atomic decrement with release ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - 1) with release ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_dec_return_release() elsewhere.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 raw_atomic64_dec_return_release(atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_dec_return_relaxed() - atomic decrement with relaxed ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - 1) with relaxed ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_dec_return_relaxed() elsewhere.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 raw_atomic64_dec_return_relaxed(atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_fetch_dec() - atomic decrement with full ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - 1) with full ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_fetch_dec() elsewhere.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 raw_atomic64_fetch_dec(atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_fetch_dec_acquire() - atomic decrement with acquire ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - 1) with acquire ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_fetch_dec_acquire() elsewhere.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 raw_atomic64_fetch_dec_acquire(atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_fetch_dec_release() - atomic decrement with release ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - 1) with release ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_fetch_dec_release() elsewhere.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 raw_atomic64_fetch_dec_release(atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_fetch_dec_relaxed() - atomic decrement with relaxed ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - 1) with relaxed ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_fetch_dec_relaxed() elsewhere.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 raw_atomic64_fetch_dec_relaxed(atomic64_t *v)
 {
···
 #endif
 }

+/**
+ * raw_atomic64_and() - atomic bitwise AND with relaxed ordering
+ * @i: s64 value
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v & @i) with relaxed ordering.
+ *
+ * Safe to use in noinstr code; prefer atomic64_and() elsewhere.
+ *
+ * Return: Nothing.
+ */
 static __always_inline void
 raw_atomic64_and(s64 i, atomic64_t *v)
 {
 	arch_atomic64_and(i, v);
 }

+/**
+ * raw_atomic64_fetch_and() - atomic bitwise AND with full ordering
+ * @i: s64 value
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v & @i) with full ordering.
2201 + * 2202 + * Safe to use in noinstr code; prefer atomic64_fetch_and() elsewhere. 2203 + * 2204 + * Return: The original value of @v. 2205 + */ 3527 2206 static __always_inline s64 3528 2207 raw_atomic64_fetch_and(s64 i, atomic64_t *v) 3529 2208 { ··· 3562 2197 #endif 3563 2198 } 3564 2199 2200 + /** 2201 + * raw_atomic64_fetch_and_acquire() - atomic bitwise AND with acquire ordering 2202 + * @i: s64 value 2203 + * @v: pointer to atomic64_t 2204 + * 2205 + * Atomically updates @v to (@v & @i) with acquire ordering. 2206 + * 2207 + * Safe to use in noinstr code; prefer atomic64_fetch_and_acquire() elsewhere. 2208 + * 2209 + * Return: The original value of @v. 2210 + */ 3565 2211 static __always_inline s64 3566 2212 raw_atomic64_fetch_and_acquire(s64 i, atomic64_t *v) 3567 2213 { ··· 3589 2213 #endif 3590 2214 } 3591 2215 2216 + /** 2217 + * raw_atomic64_fetch_and_release() - atomic bitwise AND with release ordering 2218 + * @i: s64 value 2219 + * @v: pointer to atomic64_t 2220 + * 2221 + * Atomically updates @v to (@v & @i) with release ordering. 2222 + * 2223 + * Safe to use in noinstr code; prefer atomic64_fetch_and_release() elsewhere. 2224 + * 2225 + * Return: The original value of @v. 2226 + */ 3592 2227 static __always_inline s64 3593 2228 raw_atomic64_fetch_and_release(s64 i, atomic64_t *v) 3594 2229 { ··· 3615 2228 #endif 3616 2229 } 3617 2230 2231 + /** 2232 + * raw_atomic64_fetch_and_relaxed() - atomic bitwise AND with relaxed ordering 2233 + * @i: s64 value 2234 + * @v: pointer to atomic64_t 2235 + * 2236 + * Atomically updates @v to (@v & @i) with relaxed ordering. 2237 + * 2238 + * Safe to use in noinstr code; prefer atomic64_fetch_and_relaxed() elsewhere. 2239 + * 2240 + * Return: The original value of @v. 
2241 + */ 3618 2242 static __always_inline s64 3619 2243 raw_atomic64_fetch_and_relaxed(s64 i, atomic64_t *v) 3620 2244 { ··· 3638 2240 #endif 3639 2241 } 3640 2242 2243 + /** 2244 + * raw_atomic64_andnot() - atomic bitwise AND NOT with relaxed ordering 2245 + * @i: s64 value 2246 + * @v: pointer to atomic64_t 2247 + * 2248 + * Atomically updates @v to (@v & ~@i) with relaxed ordering. 2249 + * 2250 + * Safe to use in noinstr code; prefer atomic64_andnot() elsewhere. 2251 + * 2252 + * Return: Nothing. 2253 + */ 3641 2254 static __always_inline void 3642 2255 raw_atomic64_andnot(s64 i, atomic64_t *v) 3643 2256 { ··· 3659 2250 #endif 3660 2251 } 3661 2252 2253 + /** 2254 + * raw_atomic64_fetch_andnot() - atomic bitwise AND NOT with full ordering 2255 + * @i: s64 value 2256 + * @v: pointer to atomic64_t 2257 + * 2258 + * Atomically updates @v to (@v & ~@i) with full ordering. 2259 + * 2260 + * Safe to use in noinstr code; prefer atomic64_fetch_andnot() elsewhere. 2261 + * 2262 + * Return: The original value of @v. 2263 + */ 3662 2264 static __always_inline s64 3663 2265 raw_atomic64_fetch_andnot(s64 i, atomic64_t *v) 3664 2266 { ··· 3686 2266 #endif 3687 2267 } 3688 2268 2269 + /** 2270 + * raw_atomic64_fetch_andnot_acquire() - atomic bitwise AND NOT with acquire ordering 2271 + * @i: s64 value 2272 + * @v: pointer to atomic64_t 2273 + * 2274 + * Atomically updates @v to (@v & ~@i) with acquire ordering. 2275 + * 2276 + * Safe to use in noinstr code; prefer atomic64_fetch_andnot_acquire() elsewhere. 2277 + * 2278 + * Return: The original value of @v. 2279 + */ 3689 2280 static __always_inline s64 3690 2281 raw_atomic64_fetch_andnot_acquire(s64 i, atomic64_t *v) 3691 2282 { ··· 3713 2282 #endif 3714 2283 } 3715 2284 2285 + /** 2286 + * raw_atomic64_fetch_andnot_release() - atomic bitwise AND NOT with release ordering 2287 + * @i: s64 value 2288 + * @v: pointer to atomic64_t 2289 + * 2290 + * Atomically updates @v to (@v & ~@i) with release ordering. 
2291 + * 2292 + * Safe to use in noinstr code; prefer atomic64_fetch_andnot_release() elsewhere. 2293 + * 2294 + * Return: The original value of @v. 2295 + */ 3716 2296 static __always_inline s64 3717 2297 raw_atomic64_fetch_andnot_release(s64 i, atomic64_t *v) 3718 2298 { ··· 3739 2297 #endif 3740 2298 } 3741 2299 2300 + /** 2301 + * raw_atomic64_fetch_andnot_relaxed() - atomic bitwise AND NOT with relaxed ordering 2302 + * @i: s64 value 2303 + * @v: pointer to atomic64_t 2304 + * 2305 + * Atomically updates @v to (@v & ~@i) with relaxed ordering. 2306 + * 2307 + * Safe to use in noinstr code; prefer atomic64_fetch_andnot_relaxed() elsewhere. 2308 + * 2309 + * Return: The original value of @v. 2310 + */ 3742 2311 static __always_inline s64 3743 2312 raw_atomic64_fetch_andnot_relaxed(s64 i, atomic64_t *v) 3744 2313 { ··· 3762 2309 #endif 3763 2310 } 3764 2311 2312 + /** 2313 + * raw_atomic64_or() - atomic bitwise OR with relaxed ordering 2314 + * @i: s64 value 2315 + * @v: pointer to atomic64_t 2316 + * 2317 + * Atomically updates @v to (@v | @i) with relaxed ordering. 2318 + * 2319 + * Safe to use in noinstr code; prefer atomic64_or() elsewhere. 2320 + * 2321 + * Return: Nothing. 2322 + */ 3765 2323 static __always_inline void 3766 2324 raw_atomic64_or(s64 i, atomic64_t *v) 3767 2325 { 3768 2326 arch_atomic64_or(i, v); 3769 2327 } 3770 2328 2329 + /** 2330 + * raw_atomic64_fetch_or() - atomic bitwise OR with full ordering 2331 + * @i: s64 value 2332 + * @v: pointer to atomic64_t 2333 + * 2334 + * Atomically updates @v to (@v | @i) with full ordering. 2335 + * 2336 + * Safe to use in noinstr code; prefer atomic64_fetch_or() elsewhere. 2337 + * 2338 + * Return: The original value of @v. 
2339 + */ 3771 2340 static __always_inline s64 3772 2341 raw_atomic64_fetch_or(s64 i, atomic64_t *v) 3773 2342 { ··· 3806 2331 #endif 3807 2332 } 3808 2333 2334 + /** 2335 + * raw_atomic64_fetch_or_acquire() - atomic bitwise OR with acquire ordering 2336 + * @i: s64 value 2337 + * @v: pointer to atomic64_t 2338 + * 2339 + * Atomically updates @v to (@v | @i) with acquire ordering. 2340 + * 2341 + * Safe to use in noinstr code; prefer atomic64_fetch_or_acquire() elsewhere. 2342 + * 2343 + * Return: The original value of @v. 2344 + */ 3809 2345 static __always_inline s64 3810 2346 raw_atomic64_fetch_or_acquire(s64 i, atomic64_t *v) 3811 2347 { ··· 3833 2347 #endif 3834 2348 } 3835 2349 2350 + /** 2351 + * raw_atomic64_fetch_or_release() - atomic bitwise OR with release ordering 2352 + * @i: s64 value 2353 + * @v: pointer to atomic64_t 2354 + * 2355 + * Atomically updates @v to (@v | @i) with release ordering. 2356 + * 2357 + * Safe to use in noinstr code; prefer atomic64_fetch_or_release() elsewhere. 2358 + * 2359 + * Return: The original value of @v. 2360 + */ 3836 2361 static __always_inline s64 3837 2362 raw_atomic64_fetch_or_release(s64 i, atomic64_t *v) 3838 2363 { ··· 3859 2362 #endif 3860 2363 } 3861 2364 2365 + /** 2366 + * raw_atomic64_fetch_or_relaxed() - atomic bitwise OR with relaxed ordering 2367 + * @i: s64 value 2368 + * @v: pointer to atomic64_t 2369 + * 2370 + * Atomically updates @v to (@v | @i) with relaxed ordering. 2371 + * 2372 + * Safe to use in noinstr code; prefer atomic64_fetch_or_relaxed() elsewhere. 2373 + * 2374 + * Return: The original value of @v. 2375 + */ 3862 2376 static __always_inline s64 3863 2377 raw_atomic64_fetch_or_relaxed(s64 i, atomic64_t *v) 3864 2378 { ··· 3882 2374 #endif 3883 2375 } 3884 2376 2377 + /** 2378 + * raw_atomic64_xor() - atomic bitwise XOR with relaxed ordering 2379 + * @i: s64 value 2380 + * @v: pointer to atomic64_t 2381 + * 2382 + * Atomically updates @v to (@v ^ @i) with relaxed ordering. 
2383 + * 2384 + * Safe to use in noinstr code; prefer atomic64_xor() elsewhere. 2385 + * 2386 + * Return: Nothing. 2387 + */ 3885 2388 static __always_inline void 3886 2389 raw_atomic64_xor(s64 i, atomic64_t *v) 3887 2390 { 3888 2391 arch_atomic64_xor(i, v); 3889 2392 } 3890 2393 2394 + /** 2395 + * raw_atomic64_fetch_xor() - atomic bitwise XOR with full ordering 2396 + * @i: s64 value 2397 + * @v: pointer to atomic64_t 2398 + * 2399 + * Atomically updates @v to (@v ^ @i) with full ordering. 2400 + * 2401 + * Safe to use in noinstr code; prefer atomic64_fetch_xor() elsewhere. 2402 + * 2403 + * Return: The original value of @v. 2404 + */ 3891 2405 static __always_inline s64 3892 2406 raw_atomic64_fetch_xor(s64 i, atomic64_t *v) 3893 2407 { ··· 3926 2396 #endif 3927 2397 } 3928 2398 2399 + /** 2400 + * raw_atomic64_fetch_xor_acquire() - atomic bitwise XOR with acquire ordering 2401 + * @i: s64 value 2402 + * @v: pointer to atomic64_t 2403 + * 2404 + * Atomically updates @v to (@v ^ @i) with acquire ordering. 2405 + * 2406 + * Safe to use in noinstr code; prefer atomic64_fetch_xor_acquire() elsewhere. 2407 + * 2408 + * Return: The original value of @v. 2409 + */ 3929 2410 static __always_inline s64 3930 2411 raw_atomic64_fetch_xor_acquire(s64 i, atomic64_t *v) 3931 2412 { ··· 3953 2412 #endif 3954 2413 } 3955 2414 2415 + /** 2416 + * raw_atomic64_fetch_xor_release() - atomic bitwise XOR with release ordering 2417 + * @i: s64 value 2418 + * @v: pointer to atomic64_t 2419 + * 2420 + * Atomically updates @v to (@v ^ @i) with release ordering. 2421 + * 2422 + * Safe to use in noinstr code; prefer atomic64_fetch_xor_release() elsewhere. 2423 + * 2424 + * Return: The original value of @v. 
2425 + */ 3956 2426 static __always_inline s64 3957 2427 raw_atomic64_fetch_xor_release(s64 i, atomic64_t *v) 3958 2428 { ··· 3979 2427 #endif 3980 2428 } 3981 2429 2430 + /** 2431 + * raw_atomic64_fetch_xor_relaxed() - atomic bitwise XOR with relaxed ordering 2432 + * @i: s64 value 2433 + * @v: pointer to atomic64_t 2434 + * 2435 + * Atomically updates @v to (@v ^ @i) with relaxed ordering. 2436 + * 2437 + * Safe to use in noinstr code; prefer atomic64_fetch_xor_relaxed() elsewhere. 2438 + * 2439 + * Return: The original value of @v. 2440 + */ 3982 2441 static __always_inline s64 3983 2442 raw_atomic64_fetch_xor_relaxed(s64 i, atomic64_t *v) 3984 2443 { ··· 4002 2439 #endif 4003 2440 } 4004 2441 2442 + /** 2443 + * raw_atomic64_xchg() - atomic exchange with full ordering 2444 + * @v: pointer to atomic64_t 2445 + * @new: s64 value to assign 2446 + * 2447 + * Atomically updates @v to @new with full ordering. 2448 + * 2449 + * Safe to use in noinstr code; prefer atomic64_xchg() elsewhere. 2450 + * 2451 + * Return: The original value of @v. 2452 + */ 4005 2453 static __always_inline s64 4006 2454 raw_atomic64_xchg(atomic64_t *v, s64 new) 4007 2455 { ··· 4029 2455 #endif 4030 2456 } 4031 2457 2458 + /** 2459 + * raw_atomic64_xchg_acquire() - atomic exchange with acquire ordering 2460 + * @v: pointer to atomic64_t 2461 + * @new: s64 value to assign 2462 + * 2463 + * Atomically updates @v to @new with acquire ordering. 2464 + * 2465 + * Safe to use in noinstr code; prefer atomic64_xchg_acquire() elsewhere. 2466 + * 2467 + * Return: The original value of @v. 2468 + */ 4032 2469 static __always_inline s64 4033 2470 raw_atomic64_xchg_acquire(atomic64_t *v, s64 new) 4034 2471 { ··· 4056 2471 #endif 4057 2472 } 4058 2473 2474 + /** 2475 + * raw_atomic64_xchg_release() - atomic exchange with release ordering 2476 + * @v: pointer to atomic64_t 2477 + * @new: s64 value to assign 2478 + * 2479 + * Atomically updates @v to @new with release ordering. 
2480 + * 2481 + * Safe to use in noinstr code; prefer atomic64_xchg_release() elsewhere. 2482 + * 2483 + * Return: The original value of @v. 2484 + */ 4059 2485 static __always_inline s64 4060 2486 raw_atomic64_xchg_release(atomic64_t *v, s64 new) 4061 2487 { ··· 4082 2486 #endif 4083 2487 } 4084 2488 2489 + /** 2490 + * raw_atomic64_xchg_relaxed() - atomic exchange with relaxed ordering 2491 + * @v: pointer to atomic64_t 2492 + * @new: s64 value to assign 2493 + * 2494 + * Atomically updates @v to @new with relaxed ordering. 2495 + * 2496 + * Safe to use in noinstr code; prefer atomic64_xchg_relaxed() elsewhere. 2497 + * 2498 + * Return: The original value of @v. 2499 + */ 4085 2500 static __always_inline s64 4086 2501 raw_atomic64_xchg_relaxed(atomic64_t *v, s64 new) 4087 2502 { ··· 4105 2498 #endif 4106 2499 } 4107 2500 2501 + /** 2502 + * raw_atomic64_cmpxchg() - atomic compare and exchange with full ordering 2503 + * @v: pointer to atomic64_t 2504 + * @old: s64 value to compare with 2505 + * @new: s64 value to assign 2506 + * 2507 + * If (@v == @old), atomically updates @v to @new with full ordering. 2508 + * 2509 + * Safe to use in noinstr code; prefer atomic64_cmpxchg() elsewhere. 2510 + * 2511 + * Return: The original value of @v. 2512 + */ 4108 2513 static __always_inline s64 4109 2514 raw_atomic64_cmpxchg(atomic64_t *v, s64 old, s64 new) 4110 2515 { ··· 4133 2514 #endif 4134 2515 } 4135 2516 2517 + /** 2518 + * raw_atomic64_cmpxchg_acquire() - atomic compare and exchange with acquire ordering 2519 + * @v: pointer to atomic64_t 2520 + * @old: s64 value to compare with 2521 + * @new: s64 value to assign 2522 + * 2523 + * If (@v == @old), atomically updates @v to @new with acquire ordering. 2524 + * 2525 + * Safe to use in noinstr code; prefer atomic64_cmpxchg_acquire() elsewhere. 2526 + * 2527 + * Return: The original value of @v. 
2528 + */ 4136 2529 static __always_inline s64 4137 2530 raw_atomic64_cmpxchg_acquire(atomic64_t *v, s64 old, s64 new) 4138 2531 { ··· 4161 2530 #endif 4162 2531 } 4163 2532 2533 + /** 2534 + * raw_atomic64_cmpxchg_release() - atomic compare and exchange with release ordering 2535 + * @v: pointer to atomic64_t 2536 + * @old: s64 value to compare with 2537 + * @new: s64 value to assign 2538 + * 2539 + * If (@v == @old), atomically updates @v to @new with release ordering. 2540 + * 2541 + * Safe to use in noinstr code; prefer atomic64_cmpxchg_release() elsewhere. 2542 + * 2543 + * Return: The original value of @v. 2544 + */ 4164 2545 static __always_inline s64 4165 2546 raw_atomic64_cmpxchg_release(atomic64_t *v, s64 old, s64 new) 4166 2547 { ··· 4188 2545 #endif 4189 2546 } 4190 2547 2548 + /** 2549 + * raw_atomic64_cmpxchg_relaxed() - atomic compare and exchange with relaxed ordering 2550 + * @v: pointer to atomic64_t 2551 + * @old: s64 value to compare with 2552 + * @new: s64 value to assign 2553 + * 2554 + * If (@v == @old), atomically updates @v to @new with relaxed ordering. 2555 + * 2556 + * Safe to use in noinstr code; prefer atomic64_cmpxchg_relaxed() elsewhere. 2557 + * 2558 + * Return: The original value of @v. 2559 + */ 4191 2560 static __always_inline s64 4192 2561 raw_atomic64_cmpxchg_relaxed(atomic64_t *v, s64 old, s64 new) 4193 2562 { ··· 4212 2557 #endif 4213 2558 } 4214 2559 2560 + /** 2561 + * raw_atomic64_try_cmpxchg() - atomic compare and exchange with full ordering 2562 + * @v: pointer to atomic64_t 2563 + * @old: pointer to s64 value to compare with 2564 + * @new: s64 value to assign 2565 + * 2566 + * If (@v == @old), atomically updates @v to @new with full ordering. 2567 + * Otherwise, updates @old to the current value of @v. 2568 + * 2569 + * Safe to use in noinstr code; prefer atomic64_try_cmpxchg() elsewhere. 2570 + * 2571 + * Return: @true if the exchange occurred, @false otherwise. 
2572 + */ 4215 2573 static __always_inline bool 4216 2574 raw_atomic64_try_cmpxchg(atomic64_t *v, s64 *old, s64 new) 4217 2575 { ··· 4245 2577 #endif 4246 2578 } 4247 2579 2580 + /** 2581 + * raw_atomic64_try_cmpxchg_acquire() - atomic compare and exchange with acquire ordering 2582 + * @v: pointer to atomic64_t 2583 + * @old: pointer to s64 value to compare with 2584 + * @new: s64 value to assign 2585 + * 2586 + * If (@v == @old), atomically updates @v to @new with acquire ordering. 2587 + * Otherwise, updates @old to the current value of @v. 2588 + * 2589 + * Safe to use in noinstr code; prefer atomic64_try_cmpxchg_acquire() elsewhere. 2590 + * 2591 + * Return: @true if the exchange occurred, @false otherwise. 2592 + */ 4248 2593 static __always_inline bool 4249 2594 raw_atomic64_try_cmpxchg_acquire(atomic64_t *v, s64 *old, s64 new) 4250 2595 { ··· 4278 2597 #endif 4279 2598 } 4280 2599 2600 + /** 2601 + * raw_atomic64_try_cmpxchg_release() - atomic compare and exchange with release ordering 2602 + * @v: pointer to atomic64_t 2603 + * @old: pointer to s64 value to compare with 2604 + * @new: s64 value to assign 2605 + * 2606 + * If (@v == @old), atomically updates @v to @new with release ordering. 2607 + * Otherwise, updates @old to the current value of @v. 2608 + * 2609 + * Safe to use in noinstr code; prefer atomic64_try_cmpxchg_release() elsewhere. 2610 + * 2611 + * Return: @true if the exchange occurred, @false otherwise. 2612 + */ 4281 2613 static __always_inline bool 4282 2614 raw_atomic64_try_cmpxchg_release(atomic64_t *v, s64 *old, s64 new) 4283 2615 { ··· 4310 2616 #endif 4311 2617 } 4312 2618 2619 + /** 2620 + * raw_atomic64_try_cmpxchg_relaxed() - atomic compare and exchange with relaxed ordering 2621 + * @v: pointer to atomic64_t 2622 + * @old: pointer to s64 value to compare with 2623 + * @new: s64 value to assign 2624 + * 2625 + * If (@v == @old), atomically updates @v to @new with relaxed ordering. 
2626 + * Otherwise, updates @old to the current value of @v. 2627 + * 2628 + * Safe to use in noinstr code; prefer atomic64_try_cmpxchg_relaxed() elsewhere. 2629 + * 2630 + * Return: @true if the exchange occurred, @false otherwise. 2631 + */ 4313 2632 static __always_inline bool 4314 2633 raw_atomic64_try_cmpxchg_relaxed(atomic64_t *v, s64 *old, s64 new) 4315 2634 { ··· 4339 2632 #endif 4340 2633 } 4341 2634 2635 + /** 2636 + * raw_atomic64_sub_and_test() - atomic subtract and test if zero with full ordering 2637 + * @i: s64 value to subtract 2638 + * @v: pointer to atomic64_t 2639 + * 2640 + * Atomically updates @v to (@v - @i) with full ordering. 2641 + * 2642 + * Safe to use in noinstr code; prefer atomic64_sub_and_test() elsewhere. 2643 + * 2644 + * Return: @true if the resulting value of @v is zero, @false otherwise. 2645 + */ 4342 2646 static __always_inline bool 4343 2647 raw_atomic64_sub_and_test(s64 i, atomic64_t *v) 4344 2648 { ··· 4360 2642 #endif 4361 2643 } 4362 2644 2645 + /** 2646 + * raw_atomic64_dec_and_test() - atomic decrement and test if zero with full ordering 2647 + * @v: pointer to atomic64_t 2648 + * 2649 + * Atomically updates @v to (@v - 1) with full ordering. 2650 + * 2651 + * Safe to use in noinstr code; prefer atomic64_dec_and_test() elsewhere. 2652 + * 2653 + * Return: @true if the resulting value of @v is zero, @false otherwise. 2654 + */ 4363 2655 static __always_inline bool 4364 2656 raw_atomic64_dec_and_test(atomic64_t *v) 4365 2657 { ··· 4380 2652 #endif 4381 2653 } 4382 2654 2655 + /** 2656 + * raw_atomic64_inc_and_test() - atomic increment and test if zero with full ordering 2657 + * @v: pointer to atomic64_t 2658 + * 2659 + * Atomically updates @v to (@v + 1) with full ordering. 2660 + * 2661 + * Safe to use in noinstr code; prefer atomic64_inc_and_test() elsewhere. 2662 + * 2663 + * Return: @true if the resulting value of @v is zero, @false otherwise. 
2664 + */ 4383 2665 static __always_inline bool 4384 2666 raw_atomic64_inc_and_test(atomic64_t *v) 4385 2667 { ··· 4400 2662 #endif 4401 2663 } 4402 2664 2665 + /** 2666 + * raw_atomic64_add_negative() - atomic add and test if negative with full ordering 2667 + * @i: s64 value to add 2668 + * @v: pointer to atomic64_t 2669 + * 2670 + * Atomically updates @v to (@v + @i) with full ordering. 2671 + * 2672 + * Safe to use in noinstr code; prefer atomic64_add_negative() elsewhere. 2673 + * 2674 + * Return: @true if the resulting value of @v is negative, @false otherwise. 2675 + */ 4403 2676 static __always_inline bool 4404 2677 raw_atomic64_add_negative(s64 i, atomic64_t *v) 4405 2678 { ··· 4427 2678 #endif 4428 2679 } 4429 2680 2681 + /** 2682 + * raw_atomic64_add_negative_acquire() - atomic add and test if negative with acquire ordering 2683 + * @i: s64 value to add 2684 + * @v: pointer to atomic64_t 2685 + * 2686 + * Atomically updates @v to (@v + @i) with acquire ordering. 2687 + * 2688 + * Safe to use in noinstr code; prefer atomic64_add_negative_acquire() elsewhere. 2689 + * 2690 + * Return: @true if the resulting value of @v is negative, @false otherwise. 2691 + */ 4430 2692 static __always_inline bool 4431 2693 raw_atomic64_add_negative_acquire(s64 i, atomic64_t *v) 4432 2694 { ··· 4454 2694 #endif 4455 2695 } 4456 2696 2697 + /** 2698 + * raw_atomic64_add_negative_release() - atomic add and test if negative with release ordering 2699 + * @i: s64 value to add 2700 + * @v: pointer to atomic64_t 2701 + * 2702 + * Atomically updates @v to (@v + @i) with release ordering. 2703 + * 2704 + * Safe to use in noinstr code; prefer atomic64_add_negative_release() elsewhere. 2705 + * 2706 + * Return: @true if the resulting value of @v is negative, @false otherwise. 
2707 + */ 4457 2708 static __always_inline bool 4458 2709 raw_atomic64_add_negative_release(s64 i, atomic64_t *v) 4459 2710 { ··· 4480 2709 #endif 4481 2710 } 4482 2711 2712 + /** 2713 + * raw_atomic64_add_negative_relaxed() - atomic add and test if negative with relaxed ordering 2714 + * @i: s64 value to add 2715 + * @v: pointer to atomic64_t 2716 + * 2717 + * Atomically updates @v to (@v + @i) with relaxed ordering. 2718 + * 2719 + * Safe to use in noinstr code; prefer atomic64_add_negative_relaxed() elsewhere. 2720 + * 2721 + * Return: @true if the resulting value of @v is negative, @false otherwise. 2722 + */ 4483 2723 static __always_inline bool 4484 2724 raw_atomic64_add_negative_relaxed(s64 i, atomic64_t *v) 4485 2725 { ··· 4503 2721 #endif 4504 2722 } 4505 2723 2724 + /** 2725 + * raw_atomic64_fetch_add_unless() - atomic add unless value with full ordering 2726 + * @v: pointer to atomic64_t 2727 + * @a: s64 value to add 2728 + * @u: s64 value to compare with 2729 + * 2730 + * If (@v != @u), atomically updates @v to (@v + @a) with full ordering. 2731 + * 2732 + * Safe to use in noinstr code; prefer atomic64_fetch_add_unless() elsewhere. 2733 + * 2734 + * Return: The original value of @v. 2735 + */ 4506 2736 static __always_inline s64 4507 2737 raw_atomic64_fetch_add_unless(atomic64_t *v, s64 a, s64 u) 4508 2738 { ··· 4532 2738 #endif 4533 2739 } 4534 2740 2741 + /** 2742 + * raw_atomic64_add_unless() - atomic add unless value with full ordering 2743 + * @v: pointer to atomic64_t 2744 + * @a: s64 value to add 2745 + * @u: s64 value to compare with 2746 + * 2747 + * If (@v != @u), atomically updates @v to (@v + @a) with full ordering. 2748 + * 2749 + * Safe to use in noinstr code; prefer atomic64_add_unless() elsewhere. 2750 + * 2751 + * Return: @true if @v was updated, @false otherwise. 
2752 + */ 4535 2753 static __always_inline bool 4536 2754 raw_atomic64_add_unless(atomic64_t *v, s64 a, s64 u) 4537 2755 { ··· 4554 2748 #endif 4555 2749 } 4556 2750 2751 + /** 2752 + * raw_atomic64_inc_not_zero() - atomic increment unless zero with full ordering 2753 + * @v: pointer to atomic64_t 2754 + * 2755 + * If (@v != 0), atomically updates @v to (@v + 1) with full ordering. 2756 + * 2757 + * Safe to use in noinstr code; prefer atomic64_inc_not_zero() elsewhere. 2758 + * 2759 + * Return: @true if @v was updated, @false otherwise. 2760 + */ 4557 2761 static __always_inline bool 4558 2762 raw_atomic64_inc_not_zero(atomic64_t *v) 4559 2763 { ··· 4574 2758 #endif 4575 2759 } 4576 2760 2761 + /** 2762 + * raw_atomic64_inc_unless_negative() - atomic increment unless negative with full ordering 2763 + * @v: pointer to atomic64_t 2764 + * 2765 + * If (@v >= 0), atomically updates @v to (@v + 1) with full ordering. 2766 + * 2767 + * Safe to use in noinstr code; prefer atomic64_inc_unless_negative() elsewhere. 2768 + * 2769 + * Return: @true if @v was updated, @false otherwise. 2770 + */ 4577 2771 static __always_inline bool 4578 2772 raw_atomic64_inc_unless_negative(atomic64_t *v) 4579 2773 { ··· 4601 2775 #endif 4602 2776 } 4603 2777 2778 + /** 2779 + * raw_atomic64_dec_unless_positive() - atomic decrement unless positive with full ordering 2780 + * @v: pointer to atomic64_t 2781 + * 2782 + * If (@v <= 0), atomically updates @v to (@v - 1) with full ordering. 2783 + * 2784 + * Safe to use in noinstr code; prefer atomic64_dec_unless_positive() elsewhere. 2785 + * 2786 + * Return: @true if @v was updated, @false otherwise. 
2787 + */ 4604 2788 static __always_inline bool 4605 2789 raw_atomic64_dec_unless_positive(atomic64_t *v) 4606 2790 { ··· 4628 2792 #endif 4629 2793 } 4630 2794 2795 + /** 2796 + * raw_atomic64_dec_if_positive() - atomic decrement if positive with full ordering 2797 + * @v: pointer to atomic64_t 2798 + * 2799 + * If (@v > 0), atomically updates @v to (@v - 1) with full ordering. 2800 + * 2801 + * Safe to use in noinstr code; prefer atomic64_dec_if_positive() elsewhere. 2802 + * 2803 + * Return: The old value of (@v - 1), regardless of whether @v was updated. 2804 + */ 4631 2805 static __always_inline s64 4632 2806 raw_atomic64_dec_if_positive(atomic64_t *v) 4633 2807 { ··· 4657 2811 } 4658 2812 4659 2813 #endif /* _LINUX_ATOMIC_FALLBACK_H */ 4660 - // 205e090382132f1fc85e48b46e722865f9c81309 2814 + // 3916f02c038baa3f5190d275f68b9211667fcc9d
+2770 -1
include/linux/atomic/atomic-instrumented.h
··· 16 16 #include <linux/compiler.h> 17 17 #include <linux/instrumented.h> 18 18 19 + /** 20 + * atomic_read() - atomic load with relaxed ordering 21 + * @v: pointer to atomic_t 22 + * 23 + * Atomically loads the value of @v with relaxed ordering. 24 + * 25 + * Unsafe to use in noinstr code; use raw_atomic_read() there. 26 + * 27 + * Return: The value loaded from @v. 28 + */ 19 29 static __always_inline int 20 30 atomic_read(const atomic_t *v) 21 31 { ··· 33 23 return raw_atomic_read(v); 34 24 } 35 25 26 + /** 27 + * atomic_read_acquire() - atomic load with acquire ordering 28 + * @v: pointer to atomic_t 29 + * 30 + * Atomically loads the value of @v with acquire ordering. 31 + * 32 + * Unsafe to use in noinstr code; use raw_atomic_read_acquire() there. 33 + * 34 + * Return: The value loaded from @v. 35 + */ 36 36 static __always_inline int 37 37 atomic_read_acquire(const atomic_t *v) 38 38 { ··· 50 30 return raw_atomic_read_acquire(v); 51 31 } 52 32 33 + /** 34 + * atomic_set() - atomic set with relaxed ordering 35 + * @v: pointer to atomic_t 36 + * @i: int value to assign 37 + * 38 + * Atomically sets @v to @i with relaxed ordering. 39 + * 40 + * Unsafe to use in noinstr code; use raw_atomic_set() there. 41 + * 42 + * Return: Nothing. 43 + */ 53 44 static __always_inline void 54 45 atomic_set(atomic_t *v, int i) 55 46 { ··· 68 37 raw_atomic_set(v, i); 69 38 } 70 39 40 + /** 41 + * atomic_set_release() - atomic set with release ordering 42 + * @v: pointer to atomic_t 43 + * @i: int value to assign 44 + * 45 + * Atomically sets @v to @i with release ordering. 46 + * 47 + * Unsafe to use in noinstr code; use raw_atomic_set_release() there. 48 + * 49 + * Return: Nothing. 
50 + */ 71 51 static __always_inline void 72 52 atomic_set_release(atomic_t *v, int i) 73 53 { ··· 87 45 raw_atomic_set_release(v, i); 88 46 } 89 47 48 + /** 49 + * atomic_add() - atomic add with relaxed ordering 50 + * @i: int value to add 51 + * @v: pointer to atomic_t 52 + * 53 + * Atomically updates @v to (@v + @i) with relaxed ordering. 54 + * 55 + * Unsafe to use in noinstr code; use raw_atomic_add() there. 56 + * 57 + * Return: Nothing. 58 + */ 90 59 static __always_inline void 91 60 atomic_add(int i, atomic_t *v) 92 61 { ··· 105 52 raw_atomic_add(i, v); 106 53 } 107 54 55 + /** 56 + * atomic_add_return() - atomic add with full ordering 57 + * @i: int value to add 58 + * @v: pointer to atomic_t 59 + * 60 + * Atomically updates @v to (@v + @i) with full ordering. 61 + * 62 + * Unsafe to use in noinstr code; use raw_atomic_add_return() there. 63 + * 64 + * Return: The updated value of @v. 65 + */ 108 66 static __always_inline int 109 67 atomic_add_return(int i, atomic_t *v) 110 68 { ··· 124 60 return raw_atomic_add_return(i, v); 125 61 } 126 62 63 + /** 64 + * atomic_add_return_acquire() - atomic add with acquire ordering 65 + * @i: int value to add 66 + * @v: pointer to atomic_t 67 + * 68 + * Atomically updates @v to (@v + @i) with acquire ordering. 69 + * 70 + * Unsafe to use in noinstr code; use raw_atomic_add_return_acquire() there. 71 + * 72 + * Return: The updated value of @v. 73 + */ 127 74 static __always_inline int 128 75 atomic_add_return_acquire(int i, atomic_t *v) 129 76 { ··· 142 67 return raw_atomic_add_return_acquire(i, v); 143 68 } 144 69 70 + /** 71 + * atomic_add_return_release() - atomic add with release ordering 72 + * @i: int value to add 73 + * @v: pointer to atomic_t 74 + * 75 + * Atomically updates @v to (@v + @i) with release ordering. 76 + * 77 + * Unsafe to use in noinstr code; use raw_atomic_add_return_release() there. 78 + * 79 + * Return: The updated value of @v. 
80 + */ 145 81 static __always_inline int 146 82 atomic_add_return_release(int i, atomic_t *v) 147 83 { ··· 161 75 return raw_atomic_add_return_release(i, v); 162 76 } 163 77 78 + /** 79 + * atomic_add_return_relaxed() - atomic add with relaxed ordering 80 + * @i: int value to add 81 + * @v: pointer to atomic_t 82 + * 83 + * Atomically updates @v to (@v + @i) with relaxed ordering. 84 + * 85 + * Unsafe to use in noinstr code; use raw_atomic_add_return_relaxed() there. 86 + * 87 + * Return: The updated value of @v. 88 + */ 164 89 static __always_inline int 165 90 atomic_add_return_relaxed(int i, atomic_t *v) 166 91 { ··· 179 82 return raw_atomic_add_return_relaxed(i, v); 180 83 } 181 84 85 + /** 86 + * atomic_fetch_add() - atomic add with full ordering 87 + * @i: int value to add 88 + * @v: pointer to atomic_t 89 + * 90 + * Atomically updates @v to (@v + @i) with full ordering. 91 + * 92 + * Unsafe to use in noinstr code; use raw_atomic_fetch_add() there. 93 + * 94 + * Return: The original value of @v. 95 + */ 182 96 static __always_inline int 183 97 atomic_fetch_add(int i, atomic_t *v) 184 98 { ··· 198 90 return raw_atomic_fetch_add(i, v); 199 91 } 200 92 93 + /** 94 + * atomic_fetch_add_acquire() - atomic add with acquire ordering 95 + * @i: int value to add 96 + * @v: pointer to atomic_t 97 + * 98 + * Atomically updates @v to (@v + @i) with acquire ordering. 99 + * 100 + * Unsafe to use in noinstr code; use raw_atomic_fetch_add_acquire() there. 101 + * 102 + * Return: The original value of @v. 103 + */ 201 104 static __always_inline int 202 105 atomic_fetch_add_acquire(int i, atomic_t *v) 203 106 { ··· 216 97 return raw_atomic_fetch_add_acquire(i, v); 217 98 } 218 99 100 + /** 101 + * atomic_fetch_add_release() - atomic add with release ordering 102 + * @i: int value to add 103 + * @v: pointer to atomic_t 104 + * 105 + * Atomically updates @v to (@v + @i) with release ordering. 
106 + * 107 + * Unsafe to use in noinstr code; use raw_atomic_fetch_add_release() there. 108 + * 109 + * Return: The original value of @v. 110 + */ 219 111 static __always_inline int 220 112 atomic_fetch_add_release(int i, atomic_t *v) 221 113 { ··· 235 105 return raw_atomic_fetch_add_release(i, v); 236 106 } 237 107 108 + /** 109 + * atomic_fetch_add_relaxed() - atomic add with relaxed ordering 110 + * @i: int value to add 111 + * @v: pointer to atomic_t 112 + * 113 + * Atomically updates @v to (@v + @i) with relaxed ordering. 114 + * 115 + * Unsafe to use in noinstr code; use raw_atomic_fetch_add_relaxed() there. 116 + * 117 + * Return: The original value of @v. 118 + */ 238 119 static __always_inline int 239 120 atomic_fetch_add_relaxed(int i, atomic_t *v) 240 121 { ··· 253 112 return raw_atomic_fetch_add_relaxed(i, v); 254 113 } 255 114 115 + /** 116 + * atomic_sub() - atomic subtract with relaxed ordering 117 + * @i: int value to subtract 118 + * @v: pointer to atomic_t 119 + * 120 + * Atomically updates @v to (@v - @i) with relaxed ordering. 121 + * 122 + * Unsafe to use in noinstr code; use raw_atomic_sub() there. 123 + * 124 + * Return: Nothing. 125 + */ 256 126 static __always_inline void 257 127 atomic_sub(int i, atomic_t *v) 258 128 { ··· 271 119 raw_atomic_sub(i, v); 272 120 } 273 121 122 + /** 123 + * atomic_sub_return() - atomic subtract with full ordering 124 + * @i: int value to subtract 125 + * @v: pointer to atomic_t 126 + * 127 + * Atomically updates @v to (@v - @i) with full ordering. 128 + * 129 + * Unsafe to use in noinstr code; use raw_atomic_sub_return() there. 130 + * 131 + * Return: The updated value of @v. 
132 + */ 274 133 static __always_inline int 275 134 atomic_sub_return(int i, atomic_t *v) 276 135 { ··· 290 127 return raw_atomic_sub_return(i, v); 291 128 } 292 129 130 + /** 131 + * atomic_sub_return_acquire() - atomic subtract with acquire ordering 132 + * @i: int value to subtract 133 + * @v: pointer to atomic_t 134 + * 135 + * Atomically updates @v to (@v - @i) with acquire ordering. 136 + * 137 + * Unsafe to use in noinstr code; use raw_atomic_sub_return_acquire() there. 138 + * 139 + * Return: The updated value of @v. 140 + */ 293 141 static __always_inline int 294 142 atomic_sub_return_acquire(int i, atomic_t *v) 295 143 { ··· 308 134 return raw_atomic_sub_return_acquire(i, v); 309 135 } 310 136 137 + /** 138 + * atomic_sub_return_release() - atomic subtract with release ordering 139 + * @i: int value to subtract 140 + * @v: pointer to atomic_t 141 + * 142 + * Atomically updates @v to (@v - @i) with release ordering. 143 + * 144 + * Unsafe to use in noinstr code; use raw_atomic_sub_return_release() there. 145 + * 146 + * Return: The updated value of @v. 147 + */ 311 148 static __always_inline int 312 149 atomic_sub_return_release(int i, atomic_t *v) 313 150 { ··· 327 142 return raw_atomic_sub_return_release(i, v); 328 143 } 329 144 145 + /** 146 + * atomic_sub_return_relaxed() - atomic subtract with relaxed ordering 147 + * @i: int value to subtract 148 + * @v: pointer to atomic_t 149 + * 150 + * Atomically updates @v to (@v - @i) with relaxed ordering. 151 + * 152 + * Unsafe to use in noinstr code; use raw_atomic_sub_return_relaxed() there. 153 + * 154 + * Return: The updated value of @v. 
155 + */ 330 156 static __always_inline int 331 157 atomic_sub_return_relaxed(int i, atomic_t *v) 332 158 { ··· 345 149 return raw_atomic_sub_return_relaxed(i, v); 346 150 } 347 151 152 + /** 153 + * atomic_fetch_sub() - atomic subtract with full ordering 154 + * @i: int value to subtract 155 + * @v: pointer to atomic_t 156 + * 157 + * Atomically updates @v to (@v - @i) with full ordering. 158 + * 159 + * Unsafe to use in noinstr code; use raw_atomic_fetch_sub() there. 160 + * 161 + * Return: The original value of @v. 162 + */ 348 163 static __always_inline int 349 164 atomic_fetch_sub(int i, atomic_t *v) 350 165 { ··· 364 157 return raw_atomic_fetch_sub(i, v); 365 158 } 366 159 160 + /** 161 + * atomic_fetch_sub_acquire() - atomic subtract with acquire ordering 162 + * @i: int value to subtract 163 + * @v: pointer to atomic_t 164 + * 165 + * Atomically updates @v to (@v - @i) with acquire ordering. 166 + * 167 + * Unsafe to use in noinstr code; use raw_atomic_fetch_sub_acquire() there. 168 + * 169 + * Return: The original value of @v. 170 + */ 367 171 static __always_inline int 368 172 atomic_fetch_sub_acquire(int i, atomic_t *v) 369 173 { ··· 382 164 return raw_atomic_fetch_sub_acquire(i, v); 383 165 } 384 166 167 + /** 168 + * atomic_fetch_sub_release() - atomic subtract with release ordering 169 + * @i: int value to subtract 170 + * @v: pointer to atomic_t 171 + * 172 + * Atomically updates @v to (@v - @i) with release ordering. 173 + * 174 + * Unsafe to use in noinstr code; use raw_atomic_fetch_sub_release() there. 175 + * 176 + * Return: The original value of @v. 
177 + */ 385 178 static __always_inline int 386 179 atomic_fetch_sub_release(int i, atomic_t *v) 387 180 { ··· 401 172 return raw_atomic_fetch_sub_release(i, v); 402 173 } 403 174 175 + /** 176 + * atomic_fetch_sub_relaxed() - atomic subtract with relaxed ordering 177 + * @i: int value to subtract 178 + * @v: pointer to atomic_t 179 + * 180 + * Atomically updates @v to (@v - @i) with relaxed ordering. 181 + * 182 + * Unsafe to use in noinstr code; use raw_atomic_fetch_sub_relaxed() there. 183 + * 184 + * Return: The original value of @v. 185 + */ 404 186 static __always_inline int 405 187 atomic_fetch_sub_relaxed(int i, atomic_t *v) 406 188 { ··· 419 179 return raw_atomic_fetch_sub_relaxed(i, v); 420 180 } 421 181 182 + /** 183 + * atomic_inc() - atomic increment with relaxed ordering 184 + * @v: pointer to atomic_t 185 + * 186 + * Atomically updates @v to (@v + 1) with relaxed ordering. 187 + * 188 + * Unsafe to use in noinstr code; use raw_atomic_inc() there. 189 + * 190 + * Return: Nothing. 191 + */ 422 192 static __always_inline void 423 193 atomic_inc(atomic_t *v) 424 194 { ··· 436 186 raw_atomic_inc(v); 437 187 } 438 188 189 + /** 190 + * atomic_inc_return() - atomic increment with full ordering 191 + * @v: pointer to atomic_t 192 + * 193 + * Atomically updates @v to (@v + 1) with full ordering. 194 + * 195 + * Unsafe to use in noinstr code; use raw_atomic_inc_return() there. 196 + * 197 + * Return: The updated value of @v. 198 + */ 439 199 static __always_inline int 440 200 atomic_inc_return(atomic_t *v) 441 201 { ··· 454 194 return raw_atomic_inc_return(v); 455 195 } 456 196 197 + /** 198 + * atomic_inc_return_acquire() - atomic increment with acquire ordering 199 + * @v: pointer to atomic_t 200 + * 201 + * Atomically updates @v to (@v + 1) with acquire ordering. 202 + * 203 + * Unsafe to use in noinstr code; use raw_atomic_inc_return_acquire() there. 204 + * 205 + * Return: The updated value of @v. 
206 + */ 457 207 static __always_inline int 458 208 atomic_inc_return_acquire(atomic_t *v) 459 209 { ··· 471 201 return raw_atomic_inc_return_acquire(v); 472 202 } 473 203 204 + /** 205 + * atomic_inc_return_release() - atomic increment with release ordering 206 + * @v: pointer to atomic_t 207 + * 208 + * Atomically updates @v to (@v + 1) with release ordering. 209 + * 210 + * Unsafe to use in noinstr code; use raw_atomic_inc_return_release() there. 211 + * 212 + * Return: The updated value of @v. 213 + */ 474 214 static __always_inline int 475 215 atomic_inc_return_release(atomic_t *v) 476 216 { ··· 489 209 return raw_atomic_inc_return_release(v); 490 210 } 491 211 212 + /** 213 + * atomic_inc_return_relaxed() - atomic increment with relaxed ordering 214 + * @v: pointer to atomic_t 215 + * 216 + * Atomically updates @v to (@v + 1) with relaxed ordering. 217 + * 218 + * Unsafe to use in noinstr code; use raw_atomic_inc_return_relaxed() there. 219 + * 220 + * Return: The updated value of @v. 221 + */ 492 222 static __always_inline int 493 223 atomic_inc_return_relaxed(atomic_t *v) 494 224 { ··· 506 216 return raw_atomic_inc_return_relaxed(v); 507 217 } 508 218 219 + /** 220 + * atomic_fetch_inc() - atomic increment with full ordering 221 + * @v: pointer to atomic_t 222 + * 223 + * Atomically updates @v to (@v + 1) with full ordering. 224 + * 225 + * Unsafe to use in noinstr code; use raw_atomic_fetch_inc() there. 226 + * 227 + * Return: The original value of @v. 228 + */ 509 229 static __always_inline int 510 230 atomic_fetch_inc(atomic_t *v) 511 231 { ··· 524 224 return raw_atomic_fetch_inc(v); 525 225 } 526 226 227 + /** 228 + * atomic_fetch_inc_acquire() - atomic increment with acquire ordering 229 + * @v: pointer to atomic_t 230 + * 231 + * Atomically updates @v to (@v + 1) with acquire ordering. 232 + * 233 + * Unsafe to use in noinstr code; use raw_atomic_fetch_inc_acquire() there. 234 + * 235 + * Return: The original value of @v. 
236 + */ 527 237 static __always_inline int 528 238 atomic_fetch_inc_acquire(atomic_t *v) 529 239 { ··· 541 231 return raw_atomic_fetch_inc_acquire(v); 542 232 } 543 233 234 + /** 235 + * atomic_fetch_inc_release() - atomic increment with release ordering 236 + * @v: pointer to atomic_t 237 + * 238 + * Atomically updates @v to (@v + 1) with release ordering. 239 + * 240 + * Unsafe to use in noinstr code; use raw_atomic_fetch_inc_release() there. 241 + * 242 + * Return: The original value of @v. 243 + */ 544 244 static __always_inline int 545 245 atomic_fetch_inc_release(atomic_t *v) 546 246 { ··· 559 239 return raw_atomic_fetch_inc_release(v); 560 240 } 561 241 242 + /** 243 + * atomic_fetch_inc_relaxed() - atomic increment with relaxed ordering 244 + * @v: pointer to atomic_t 245 + * 246 + * Atomically updates @v to (@v + 1) with relaxed ordering. 247 + * 248 + * Unsafe to use in noinstr code; use raw_atomic_fetch_inc_relaxed() there. 249 + * 250 + * Return: The original value of @v. 251 + */ 562 252 static __always_inline int 563 253 atomic_fetch_inc_relaxed(atomic_t *v) 564 254 { ··· 576 246 return raw_atomic_fetch_inc_relaxed(v); 577 247 } 578 248 249 + /** 250 + * atomic_dec() - atomic decrement with relaxed ordering 251 + * @v: pointer to atomic_t 252 + * 253 + * Atomically updates @v to (@v - 1) with relaxed ordering. 254 + * 255 + * Unsafe to use in noinstr code; use raw_atomic_dec() there. 256 + * 257 + * Return: Nothing. 258 + */ 579 259 static __always_inline void 580 260 atomic_dec(atomic_t *v) 581 261 { ··· 593 253 raw_atomic_dec(v); 594 254 } 595 255 256 + /** 257 + * atomic_dec_return() - atomic decrement with full ordering 258 + * @v: pointer to atomic_t 259 + * 260 + * Atomically updates @v to (@v - 1) with full ordering. 261 + * 262 + * Unsafe to use in noinstr code; use raw_atomic_dec_return() there. 263 + * 264 + * Return: The updated value of @v. 
265 + */ 596 266 static __always_inline int 597 267 atomic_dec_return(atomic_t *v) 598 268 { ··· 611 261 return raw_atomic_dec_return(v); 612 262 } 613 263 264 + /** 265 + * atomic_dec_return_acquire() - atomic decrement with acquire ordering 266 + * @v: pointer to atomic_t 267 + * 268 + * Atomically updates @v to (@v - 1) with acquire ordering. 269 + * 270 + * Unsafe to use in noinstr code; use raw_atomic_dec_return_acquire() there. 271 + * 272 + * Return: The updated value of @v. 273 + */ 614 274 static __always_inline int 615 275 atomic_dec_return_acquire(atomic_t *v) 616 276 { ··· 628 268 return raw_atomic_dec_return_acquire(v); 629 269 } 630 270 271 + /** 272 + * atomic_dec_return_release() - atomic decrement with release ordering 273 + * @v: pointer to atomic_t 274 + * 275 + * Atomically updates @v to (@v - 1) with release ordering. 276 + * 277 + * Unsafe to use in noinstr code; use raw_atomic_dec_return_release() there. 278 + * 279 + * Return: The updated value of @v. 280 + */ 631 281 static __always_inline int 632 282 atomic_dec_return_release(atomic_t *v) 633 283 { ··· 646 276 return raw_atomic_dec_return_release(v); 647 277 } 648 278 279 + /** 280 + * atomic_dec_return_relaxed() - atomic decrement with relaxed ordering 281 + * @v: pointer to atomic_t 282 + * 283 + * Atomically updates @v to (@v - 1) with relaxed ordering. 284 + * 285 + * Unsafe to use in noinstr code; use raw_atomic_dec_return_relaxed() there. 286 + * 287 + * Return: The updated value of @v. 288 + */ 649 289 static __always_inline int 650 290 atomic_dec_return_relaxed(atomic_t *v) 651 291 { ··· 663 283 return raw_atomic_dec_return_relaxed(v); 664 284 } 665 285 286 + /** 287 + * atomic_fetch_dec() - atomic decrement with full ordering 288 + * @v: pointer to atomic_t 289 + * 290 + * Atomically updates @v to (@v - 1) with full ordering. 291 + * 292 + * Unsafe to use in noinstr code; use raw_atomic_fetch_dec() there. 293 + * 294 + * Return: The original value of @v. 
295 + */ 666 296 static __always_inline int 667 297 atomic_fetch_dec(atomic_t *v) 668 298 { ··· 681 291 return raw_atomic_fetch_dec(v); 682 292 } 683 293 294 + /** 295 + * atomic_fetch_dec_acquire() - atomic decrement with acquire ordering 296 + * @v: pointer to atomic_t 297 + * 298 + * Atomically updates @v to (@v - 1) with acquire ordering. 299 + * 300 + * Unsafe to use in noinstr code; use raw_atomic_fetch_dec_acquire() there. 301 + * 302 + * Return: The original value of @v. 303 + */ 684 304 static __always_inline int 685 305 atomic_fetch_dec_acquire(atomic_t *v) 686 306 { ··· 698 298 return raw_atomic_fetch_dec_acquire(v); 699 299 } 700 300 301 + /** 302 + * atomic_fetch_dec_release() - atomic decrement with release ordering 303 + * @v: pointer to atomic_t 304 + * 305 + * Atomically updates @v to (@v - 1) with release ordering. 306 + * 307 + * Unsafe to use in noinstr code; use raw_atomic_fetch_dec_release() there. 308 + * 309 + * Return: The original value of @v. 310 + */ 701 311 static __always_inline int 702 312 atomic_fetch_dec_release(atomic_t *v) 703 313 { ··· 716 306 return raw_atomic_fetch_dec_release(v); 717 307 } 718 308 309 + /** 310 + * atomic_fetch_dec_relaxed() - atomic decrement with relaxed ordering 311 + * @v: pointer to atomic_t 312 + * 313 + * Atomically updates @v to (@v - 1) with relaxed ordering. 314 + * 315 + * Unsafe to use in noinstr code; use raw_atomic_fetch_dec_relaxed() there. 316 + * 317 + * Return: The original value of @v. 318 + */ 719 319 static __always_inline int 720 320 atomic_fetch_dec_relaxed(atomic_t *v) 721 321 { ··· 733 313 return raw_atomic_fetch_dec_relaxed(v); 734 314 } 735 315 316 + /** 317 + * atomic_and() - atomic bitwise AND with relaxed ordering 318 + * @i: int value 319 + * @v: pointer to atomic_t 320 + * 321 + * Atomically updates @v to (@v & @i) with relaxed ordering. 322 + * 323 + * Unsafe to use in noinstr code; use raw_atomic_and() there. 324 + * 325 + * Return: Nothing. 
326 + */ 736 327 static __always_inline void 737 328 atomic_and(int i, atomic_t *v) 738 329 { ··· 751 320 raw_atomic_and(i, v); 752 321 } 753 322 323 + /** 324 + * atomic_fetch_and() - atomic bitwise AND with full ordering 325 + * @i: int value 326 + * @v: pointer to atomic_t 327 + * 328 + * Atomically updates @v to (@v & @i) with full ordering. 329 + * 330 + * Unsafe to use in noinstr code; use raw_atomic_fetch_and() there. 331 + * 332 + * Return: The original value of @v. 333 + */ 754 334 static __always_inline int 755 335 atomic_fetch_and(int i, atomic_t *v) 756 336 { ··· 770 328 return raw_atomic_fetch_and(i, v); 771 329 } 772 330 331 + /** 332 + * atomic_fetch_and_acquire() - atomic bitwise AND with acquire ordering 333 + * @i: int value 334 + * @v: pointer to atomic_t 335 + * 336 + * Atomically updates @v to (@v & @i) with acquire ordering. 337 + * 338 + * Unsafe to use in noinstr code; use raw_atomic_fetch_and_acquire() there. 339 + * 340 + * Return: The original value of @v. 341 + */ 773 342 static __always_inline int 774 343 atomic_fetch_and_acquire(int i, atomic_t *v) 775 344 { ··· 788 335 return raw_atomic_fetch_and_acquire(i, v); 789 336 } 790 337 338 + /** 339 + * atomic_fetch_and_release() - atomic bitwise AND with release ordering 340 + * @i: int value 341 + * @v: pointer to atomic_t 342 + * 343 + * Atomically updates @v to (@v & @i) with release ordering. 344 + * 345 + * Unsafe to use in noinstr code; use raw_atomic_fetch_and_release() there. 346 + * 347 + * Return: The original value of @v. 348 + */ 791 349 static __always_inline int 792 350 atomic_fetch_and_release(int i, atomic_t *v) 793 351 { ··· 807 343 return raw_atomic_fetch_and_release(i, v); 808 344 } 809 345 346 + /** 347 + * atomic_fetch_and_relaxed() - atomic bitwise AND with relaxed ordering 348 + * @i: int value 349 + * @v: pointer to atomic_t 350 + * 351 + * Atomically updates @v to (@v & @i) with relaxed ordering. 
352 + * 353 + * Unsafe to use in noinstr code; use raw_atomic_fetch_and_relaxed() there. 354 + * 355 + * Return: The original value of @v. 356 + */ 810 357 static __always_inline int 811 358 atomic_fetch_and_relaxed(int i, atomic_t *v) 812 359 { ··· 825 350 return raw_atomic_fetch_and_relaxed(i, v); 826 351 } 827 352 353 + /** 354 + * atomic_andnot() - atomic bitwise AND NOT with relaxed ordering 355 + * @i: int value 356 + * @v: pointer to atomic_t 357 + * 358 + * Atomically updates @v to (@v & ~@i) with relaxed ordering. 359 + * 360 + * Unsafe to use in noinstr code; use raw_atomic_andnot() there. 361 + * 362 + * Return: Nothing. 363 + */ 828 364 static __always_inline void 829 365 atomic_andnot(int i, atomic_t *v) 830 366 { ··· 843 357 raw_atomic_andnot(i, v); 844 358 } 845 359 360 + /** 361 + * atomic_fetch_andnot() - atomic bitwise AND NOT with full ordering 362 + * @i: int value 363 + * @v: pointer to atomic_t 364 + * 365 + * Atomically updates @v to (@v & ~@i) with full ordering. 366 + * 367 + * Unsafe to use in noinstr code; use raw_atomic_fetch_andnot() there. 368 + * 369 + * Return: The original value of @v. 370 + */ 846 371 static __always_inline int 847 372 atomic_fetch_andnot(int i, atomic_t *v) 848 373 { ··· 862 365 return raw_atomic_fetch_andnot(i, v); 863 366 } 864 367 368 + /** 369 + * atomic_fetch_andnot_acquire() - atomic bitwise AND NOT with acquire ordering 370 + * @i: int value 371 + * @v: pointer to atomic_t 372 + * 373 + * Atomically updates @v to (@v & ~@i) with acquire ordering. 374 + * 375 + * Unsafe to use in noinstr code; use raw_atomic_fetch_andnot_acquire() there. 376 + * 377 + * Return: The original value of @v. 
378 + */ 865 379 static __always_inline int 866 380 atomic_fetch_andnot_acquire(int i, atomic_t *v) 867 381 { ··· 880 372 return raw_atomic_fetch_andnot_acquire(i, v); 881 373 } 882 374 375 + /** 376 + * atomic_fetch_andnot_release() - atomic bitwise AND NOT with release ordering 377 + * @i: int value 378 + * @v: pointer to atomic_t 379 + * 380 + * Atomically updates @v to (@v & ~@i) with release ordering. 381 + * 382 + * Unsafe to use in noinstr code; use raw_atomic_fetch_andnot_release() there. 383 + * 384 + * Return: The original value of @v. 385 + */ 883 386 static __always_inline int 884 387 atomic_fetch_andnot_release(int i, atomic_t *v) 885 388 { ··· 899 380 return raw_atomic_fetch_andnot_release(i, v); 900 381 } 901 382 383 + /** 384 + * atomic_fetch_andnot_relaxed() - atomic bitwise AND NOT with relaxed ordering 385 + * @i: int value 386 + * @v: pointer to atomic_t 387 + * 388 + * Atomically updates @v to (@v & ~@i) with relaxed ordering. 389 + * 390 + * Unsafe to use in noinstr code; use raw_atomic_fetch_andnot_relaxed() there. 391 + * 392 + * Return: The original value of @v. 393 + */ 902 394 static __always_inline int 903 395 atomic_fetch_andnot_relaxed(int i, atomic_t *v) 904 396 { ··· 917 387 return raw_atomic_fetch_andnot_relaxed(i, v); 918 388 } 919 389 390 + /** 391 + * atomic_or() - atomic bitwise OR with relaxed ordering 392 + * @i: int value 393 + * @v: pointer to atomic_t 394 + * 395 + * Atomically updates @v to (@v | @i) with relaxed ordering. 396 + * 397 + * Unsafe to use in noinstr code; use raw_atomic_or() there. 398 + * 399 + * Return: Nothing. 400 + */ 920 401 static __always_inline void 921 402 atomic_or(int i, atomic_t *v) 922 403 { ··· 935 394 raw_atomic_or(i, v); 936 395 } 937 396 397 + /** 398 + * atomic_fetch_or() - atomic bitwise OR with full ordering 399 + * @i: int value 400 + * @v: pointer to atomic_t 401 + * 402 + * Atomically updates @v to (@v | @i) with full ordering. 
403 + * 404 + * Unsafe to use in noinstr code; use raw_atomic_fetch_or() there. 405 + * 406 + * Return: The original value of @v. 407 + */ 938 408 static __always_inline int 939 409 atomic_fetch_or(int i, atomic_t *v) 940 410 { ··· 954 402 return raw_atomic_fetch_or(i, v); 955 403 } 956 404 405 + /** 406 + * atomic_fetch_or_acquire() - atomic bitwise OR with acquire ordering 407 + * @i: int value 408 + * @v: pointer to atomic_t 409 + * 410 + * Atomically updates @v to (@v | @i) with acquire ordering. 411 + * 412 + * Unsafe to use in noinstr code; use raw_atomic_fetch_or_acquire() there. 413 + * 414 + * Return: The original value of @v. 415 + */ 957 416 static __always_inline int 958 417 atomic_fetch_or_acquire(int i, atomic_t *v) 959 418 { ··· 972 409 return raw_atomic_fetch_or_acquire(i, v); 973 410 } 974 411 412 + /** 413 + * atomic_fetch_or_release() - atomic bitwise OR with release ordering 414 + * @i: int value 415 + * @v: pointer to atomic_t 416 + * 417 + * Atomically updates @v to (@v | @i) with release ordering. 418 + * 419 + * Unsafe to use in noinstr code; use raw_atomic_fetch_or_release() there. 420 + * 421 + * Return: The original value of @v. 422 + */ 975 423 static __always_inline int 976 424 atomic_fetch_or_release(int i, atomic_t *v) 977 425 { ··· 991 417 return raw_atomic_fetch_or_release(i, v); 992 418 } 993 419 420 + /** 421 + * atomic_fetch_or_relaxed() - atomic bitwise OR with relaxed ordering 422 + * @i: int value 423 + * @v: pointer to atomic_t 424 + * 425 + * Atomically updates @v to (@v | @i) with relaxed ordering. 426 + * 427 + * Unsafe to use in noinstr code; use raw_atomic_fetch_or_relaxed() there. 428 + * 429 + * Return: The original value of @v. 
430 + */ 994 431 static __always_inline int 995 432 atomic_fetch_or_relaxed(int i, atomic_t *v) 996 433 { ··· 1009 424 return raw_atomic_fetch_or_relaxed(i, v); 1010 425 } 1011 426 427 + /** 428 + * atomic_xor() - atomic bitwise XOR with relaxed ordering 429 + * @i: int value 430 + * @v: pointer to atomic_t 431 + * 432 + * Atomically updates @v to (@v ^ @i) with relaxed ordering. 433 + * 434 + * Unsafe to use in noinstr code; use raw_atomic_xor() there. 435 + * 436 + * Return: Nothing. 437 + */ 1012 438 static __always_inline void 1013 439 atomic_xor(int i, atomic_t *v) 1014 440 { ··· 1027 431 raw_atomic_xor(i, v); 1028 432 } 1029 433 434 + /** 435 + * atomic_fetch_xor() - atomic bitwise XOR with full ordering 436 + * @i: int value 437 + * @v: pointer to atomic_t 438 + * 439 + * Atomically updates @v to (@v ^ @i) with full ordering. 440 + * 441 + * Unsafe to use in noinstr code; use raw_atomic_fetch_xor() there. 442 + * 443 + * Return: The original value of @v. 444 + */ 1030 445 static __always_inline int 1031 446 atomic_fetch_xor(int i, atomic_t *v) 1032 447 { ··· 1046 439 return raw_atomic_fetch_xor(i, v); 1047 440 } 1048 441 442 + /** 443 + * atomic_fetch_xor_acquire() - atomic bitwise XOR with acquire ordering 444 + * @i: int value 445 + * @v: pointer to atomic_t 446 + * 447 + * Atomically updates @v to (@v ^ @i) with acquire ordering. 448 + * 449 + * Unsafe to use in noinstr code; use raw_atomic_fetch_xor_acquire() there. 450 + * 451 + * Return: The original value of @v. 452 + */ 1049 453 static __always_inline int 1050 454 atomic_fetch_xor_acquire(int i, atomic_t *v) 1051 455 { ··· 1064 446 return raw_atomic_fetch_xor_acquire(i, v); 1065 447 } 1066 448 449 + /** 450 + * atomic_fetch_xor_release() - atomic bitwise XOR with release ordering 451 + * @i: int value 452 + * @v: pointer to atomic_t 453 + * 454 + * Atomically updates @v to (@v ^ @i) with release ordering. 455 + * 456 + * Unsafe to use in noinstr code; use raw_atomic_fetch_xor_release() there. 
457 + * 458 + * Return: The original value of @v. 459 + */ 1067 460 static __always_inline int 1068 461 atomic_fetch_xor_release(int i, atomic_t *v) 1069 462 { ··· 1083 454 return raw_atomic_fetch_xor_release(i, v); 1084 455 } 1085 456 457 + /** 458 + * atomic_fetch_xor_relaxed() - atomic bitwise XOR with relaxed ordering 459 + * @i: int value 460 + * @v: pointer to atomic_t 461 + * 462 + * Atomically updates @v to (@v ^ @i) with relaxed ordering. 463 + * 464 + * Unsafe to use in noinstr code; use raw_atomic_fetch_xor_relaxed() there. 465 + * 466 + * Return: The original value of @v. 467 + */ 1086 468 static __always_inline int 1087 469 atomic_fetch_xor_relaxed(int i, atomic_t *v) 1088 470 { ··· 1101 461 return raw_atomic_fetch_xor_relaxed(i, v); 1102 462 } 1103 463 464 + /** 465 + * atomic_xchg() - atomic exchange with full ordering 466 + * @v: pointer to atomic_t 467 + * @new: int value to assign 468 + * 469 + * Atomically updates @v to @new with full ordering. 470 + * 471 + * Unsafe to use in noinstr code; use raw_atomic_xchg() there. 472 + * 473 + * Return: The original value of @v. 474 + */ 1104 475 static __always_inline int 1105 476 atomic_xchg(atomic_t *v, int new) 1106 477 { ··· 1120 469 return raw_atomic_xchg(v, new); 1121 470 } 1122 471 472 + /** 473 + * atomic_xchg_acquire() - atomic exchange with acquire ordering 474 + * @v: pointer to atomic_t 475 + * @new: int value to assign 476 + * 477 + * Atomically updates @v to @new with acquire ordering. 478 + * 479 + * Unsafe to use in noinstr code; use raw_atomic_xchg_acquire() there. 480 + * 481 + * Return: The original value of @v. 
482 + */ 1123 483 static __always_inline int 1124 484 atomic_xchg_acquire(atomic_t *v, int new) 1125 485 { ··· 1138 476 return raw_atomic_xchg_acquire(v, new); 1139 477 } 1140 478 479 + /** 480 + * atomic_xchg_release() - atomic exchange with release ordering 481 + * @v: pointer to atomic_t 482 + * @new: int value to assign 483 + * 484 + * Atomically updates @v to @new with release ordering. 485 + * 486 + * Unsafe to use in noinstr code; use raw_atomic_xchg_release() there. 487 + * 488 + * Return: The original value of @v. 489 + */ 1141 490 static __always_inline int 1142 491 atomic_xchg_release(atomic_t *v, int new) 1143 492 { ··· 1157 484 return raw_atomic_xchg_release(v, new); 1158 485 } 1159 486 487 + /** 488 + * atomic_xchg_relaxed() - atomic exchange with relaxed ordering 489 + * @v: pointer to atomic_t 490 + * @new: int value to assign 491 + * 492 + * Atomically updates @v to @new with relaxed ordering. 493 + * 494 + * Unsafe to use in noinstr code; use raw_atomic_xchg_relaxed() there. 495 + * 496 + * Return: The original value of @v. 497 + */ 1160 498 static __always_inline int 1161 499 atomic_xchg_relaxed(atomic_t *v, int new) 1162 500 { ··· 1175 491 return raw_atomic_xchg_relaxed(v, new); 1176 492 } 1177 493 494 + /** 495 + * atomic_cmpxchg() - atomic compare and exchange with full ordering 496 + * @v: pointer to atomic_t 497 + * @old: int value to compare with 498 + * @new: int value to assign 499 + * 500 + * If (@v == @old), atomically updates @v to @new with full ordering. 501 + * 502 + * Unsafe to use in noinstr code; use raw_atomic_cmpxchg() there. 503 + * 504 + * Return: The original value of @v. 
505 + */ 1178 506 static __always_inline int 1179 507 atomic_cmpxchg(atomic_t *v, int old, int new) 1180 508 { ··· 1195 499 return raw_atomic_cmpxchg(v, old, new); 1196 500 } 1197 501 502 + /** 503 + * atomic_cmpxchg_acquire() - atomic compare and exchange with acquire ordering 504 + * @v: pointer to atomic_t 505 + * @old: int value to compare with 506 + * @new: int value to assign 507 + * 508 + * If (@v == @old), atomically updates @v to @new with acquire ordering. 509 + * 510 + * Unsafe to use in noinstr code; use raw_atomic_cmpxchg_acquire() there. 511 + * 512 + * Return: The original value of @v. 513 + */ 1198 514 static __always_inline int 1199 515 atomic_cmpxchg_acquire(atomic_t *v, int old, int new) 1200 516 { ··· 1214 506 return raw_atomic_cmpxchg_acquire(v, old, new); 1215 507 } 1216 508 509 + /** 510 + * atomic_cmpxchg_release() - atomic compare and exchange with release ordering 511 + * @v: pointer to atomic_t 512 + * @old: int value to compare with 513 + * @new: int value to assign 514 + * 515 + * If (@v == @old), atomically updates @v to @new with release ordering. 516 + * 517 + * Unsafe to use in noinstr code; use raw_atomic_cmpxchg_release() there. 518 + * 519 + * Return: The original value of @v. 520 + */ 1217 521 static __always_inline int 1218 522 atomic_cmpxchg_release(atomic_t *v, int old, int new) 1219 523 { ··· 1234 514 return raw_atomic_cmpxchg_release(v, old, new); 1235 515 } 1236 516 517 + /** 518 + * atomic_cmpxchg_relaxed() - atomic compare and exchange with relaxed ordering 519 + * @v: pointer to atomic_t 520 + * @old: int value to compare with 521 + * @new: int value to assign 522 + * 523 + * If (@v == @old), atomically updates @v to @new with relaxed ordering. 524 + * 525 + * Unsafe to use in noinstr code; use raw_atomic_cmpxchg_relaxed() there. 526 + * 527 + * Return: The original value of @v. 
528 + */ 1237 529 static __always_inline int 1238 530 atomic_cmpxchg_relaxed(atomic_t *v, int old, int new) 1239 531 { ··· 1253 521 return raw_atomic_cmpxchg_relaxed(v, old, new); 1254 522 } 1255 523 524 + /** 525 + * atomic_try_cmpxchg() - atomic compare and exchange with full ordering 526 + * @v: pointer to atomic_t 527 + * @old: pointer to int value to compare with 528 + * @new: int value to assign 529 + * 530 + * If (@v == @old), atomically updates @v to @new with full ordering. 531 + * Otherwise, updates @old to the current value of @v. 532 + * 533 + * Unsafe to use in noinstr code; use raw_atomic_try_cmpxchg() there. 534 + * 535 + * Return: @true if the exchange occured, @false otherwise. 536 + */ 1256 537 static __always_inline bool 1257 538 atomic_try_cmpxchg(atomic_t *v, int *old, int new) 1258 539 { ··· 1275 530 return raw_atomic_try_cmpxchg(v, old, new); 1276 531 } 1277 532 533 + /** 534 + * atomic_try_cmpxchg_acquire() - atomic compare and exchange with acquire ordering 535 + * @v: pointer to atomic_t 536 + * @old: pointer to int value to compare with 537 + * @new: int value to assign 538 + * 539 + * If (@v == @old), atomically updates @v to @new with acquire ordering. 540 + * Otherwise, updates @old to the current value of @v. 541 + * 542 + * Unsafe to use in noinstr code; use raw_atomic_try_cmpxchg_acquire() there. 543 + * 544 + * Return: @true if the exchange occured, @false otherwise. 545 + */ 1278 546 static __always_inline bool 1279 547 atomic_try_cmpxchg_acquire(atomic_t *v, int *old, int new) 1280 548 { ··· 1296 538 return raw_atomic_try_cmpxchg_acquire(v, old, new); 1297 539 } 1298 540 541 + /** 542 + * atomic_try_cmpxchg_release() - atomic compare and exchange with release ordering 543 + * @v: pointer to atomic_t 544 + * @old: pointer to int value to compare with 545 + * @new: int value to assign 546 + * 547 + * If (@v == @old), atomically updates @v to @new with release ordering. 548 + * Otherwise, updates @old to the current value of @v. 
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_try_cmpxchg_release() there.
+ *
+ * Return: @true if the exchange occurred, @false otherwise.
+ */
 static __always_inline bool
 atomic_try_cmpxchg_release(atomic_t *v, int *old, int new)
 {
···
 	return raw_atomic_try_cmpxchg_release(v, old, new);
 }
 
+/**
+ * atomic_try_cmpxchg_relaxed() - atomic compare and exchange with relaxed ordering
+ * @v: pointer to atomic_t
+ * @old: pointer to int value to compare with
+ * @new: int value to assign
+ *
+ * If (@v == @old), atomically updates @v to @new with relaxed ordering.
+ * Otherwise, updates @old to the current value of @v.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_try_cmpxchg_relaxed() there.
+ *
+ * Return: @true if the exchange occurred, @false otherwise.
+ */
 static __always_inline bool
 atomic_try_cmpxchg_relaxed(atomic_t *v, int *old, int new)
 {
···
 	return raw_atomic_try_cmpxchg_relaxed(v, old, new);
 }
 
+/**
+ * atomic_sub_and_test() - atomic subtract and test if zero with full ordering
+ * @i: int value to subtract
+ * @v: pointer to atomic_t
+ *
+ * Atomically updates @v to (@v - @i) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_sub_and_test() there.
+ *
+ * Return: @true if the resulting value of @v is zero, @false otherwise.
+ */
 static __always_inline bool
 atomic_sub_and_test(int i, atomic_t *v)
 {
···
 	return raw_atomic_sub_and_test(i, v);
 }
 
+/**
+ * atomic_dec_and_test() - atomic decrement and test if zero with full ordering
+ * @v: pointer to atomic_t
+ *
+ * Atomically updates @v to (@v - 1) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_dec_and_test() there.
+ *
+ * Return: @true if the resulting value of @v is zero, @false otherwise.
+ */
 static __always_inline bool
 atomic_dec_and_test(atomic_t *v)
 {
···
 	return raw_atomic_dec_and_test(v);
 }
 
+/**
+ * atomic_inc_and_test() - atomic increment and test if zero with full ordering
+ * @v: pointer to atomic_t
+ *
+ * Atomically updates @v to (@v + 1) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_inc_and_test() there.
+ *
+ * Return: @true if the resulting value of @v is zero, @false otherwise.
+ */
 static __always_inline bool
 atomic_inc_and_test(atomic_t *v)
 {
···
 	return raw_atomic_inc_and_test(v);
 }
 
+/**
+ * atomic_add_negative() - atomic add and test if negative with full ordering
+ * @i: int value to add
+ * @v: pointer to atomic_t
+ *
+ * Atomically updates @v to (@v + @i) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_add_negative() there.
+ *
+ * Return: @true if the resulting value of @v is negative, @false otherwise.
+ */
 static __always_inline bool
 atomic_add_negative(int i, atomic_t *v)
 {
···
 	return raw_atomic_add_negative(i, v);
 }
 
+/**
+ * atomic_add_negative_acquire() - atomic add and test if negative with acquire ordering
+ * @i: int value to add
+ * @v: pointer to atomic_t
+ *
+ * Atomically updates @v to (@v + @i) with acquire ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_add_negative_acquire() there.
+ *
+ * Return: @true if the resulting value of @v is negative, @false otherwise.
+ */
 static __always_inline bool
 atomic_add_negative_acquire(int i, atomic_t *v)
 {
···
 	return raw_atomic_add_negative_acquire(i, v);
 }
 
+/**
+ * atomic_add_negative_release() - atomic add and test if negative with release ordering
+ * @i: int value to add
+ * @v: pointer to atomic_t
+ *
+ * Atomically updates @v to (@v + @i) with release ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_add_negative_release() there.
+ *
+ * Return: @true if the resulting value of @v is negative, @false otherwise.
+ */
 static __always_inline bool
 atomic_add_negative_release(int i, atomic_t *v)
 {
···
 	return raw_atomic_add_negative_release(i, v);
 }
 
+/**
+ * atomic_add_negative_relaxed() - atomic add and test if negative with relaxed ordering
+ * @i: int value to add
+ * @v: pointer to atomic_t
+ *
+ * Atomically updates @v to (@v + @i) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_add_negative_relaxed() there.
+ *
+ * Return: @true if the resulting value of @v is negative, @false otherwise.
+ */
 static __always_inline bool
 atomic_add_negative_relaxed(int i, atomic_t *v)
 {
···
 	return raw_atomic_add_negative_relaxed(i, v);
 }
 
+/**
+ * atomic_fetch_add_unless() - atomic add unless value with full ordering
+ * @v: pointer to atomic_t
+ * @a: int value to add
+ * @u: int value to compare with
+ *
+ * If (@v != @u), atomically updates @v to (@v + @a) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_fetch_add_unless() there.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline int
 atomic_fetch_add_unless(atomic_t *v, int a, int u)
 {
···
 	return raw_atomic_fetch_add_unless(v, a, u);
 }
 
+/**
+ * atomic_add_unless() - atomic add unless value with full ordering
+ * @v: pointer to atomic_t
+ * @a: int value to add
+ * @u: int value to compare with
+ *
+ * If (@v != @u), atomically updates @v to (@v + @a) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_add_unless() there.
+ *
+ * Return: @true if @v was updated, @false otherwise.
+ */
 static __always_inline bool
 atomic_add_unless(atomic_t *v, int a, int u)
 {
···
 	return raw_atomic_add_unless(v, a, u);
 }
 
+/**
+ * atomic_inc_not_zero() - atomic increment unless zero with full ordering
+ * @v: pointer to atomic_t
+ *
+ * If (@v != 0), atomically updates @v to (@v + 1) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_inc_not_zero() there.
+ *
+ * Return: @true if @v was updated, @false otherwise.
+ */
 static __always_inline bool
 atomic_inc_not_zero(atomic_t *v)
 {
···
 	return raw_atomic_inc_not_zero(v);
 }
 
+/**
+ * atomic_inc_unless_negative() - atomic increment unless negative with full ordering
+ * @v: pointer to atomic_t
+ *
+ * If (@v >= 0), atomically updates @v to (@v + 1) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_inc_unless_negative() there.
+ *
+ * Return: @true if @v was updated, @false otherwise.
+ */
 static __always_inline bool
 atomic_inc_unless_negative(atomic_t *v)
 {
···
 	return raw_atomic_inc_unless_negative(v);
 }
 
+/**
+ * atomic_dec_unless_positive() - atomic decrement unless positive with full ordering
+ * @v: pointer to atomic_t
+ *
+ * If (@v <= 0), atomically updates @v to (@v - 1) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_dec_unless_positive() there.
+ *
+ * Return: @true if @v was updated, @false otherwise.
+ */
 static __always_inline bool
 atomic_dec_unless_positive(atomic_t *v)
 {
···
 	return raw_atomic_dec_unless_positive(v);
 }
 
+/**
+ * atomic_dec_if_positive() - atomic decrement if positive with full ordering
+ * @v: pointer to atomic_t
+ *
+ * If (@v > 0), atomically updates @v to (@v - 1) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_dec_if_positive() there.
+ *
+ * Return: The old value of (@v - 1), regardless of whether @v was updated.
+ */
 static __always_inline int
 atomic_dec_if_positive(atomic_t *v)
 {
···
 	return raw_atomic_dec_if_positive(v);
 }
 
+/**
+ * atomic64_read() - atomic load with relaxed ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically loads the value of @v with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_read() there.
+ *
+ * Return: The value loaded from @v.
+ */
 static __always_inline s64
 atomic64_read(const atomic64_t *v)
 {
···
 	return raw_atomic64_read(v);
 }
 
+/**
+ * atomic64_read_acquire() - atomic load with acquire ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically loads the value of @v with acquire ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_read_acquire() there.
+ *
+ * Return: The value loaded from @v.
+ */
 static __always_inline s64
 atomic64_read_acquire(const atomic64_t *v)
 {
···
 	return raw_atomic64_read_acquire(v);
 }
 
+/**
+ * atomic64_set() - atomic set with relaxed ordering
+ * @v: pointer to atomic64_t
+ * @i: s64 value to assign
+ *
+ * Atomically sets @v to @i with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_set() there.
+ *
+ * Return: Nothing.
+ */
 static __always_inline void
 atomic64_set(atomic64_t *v, s64 i)
 {
···
 	raw_atomic64_set(v, i);
 }
 
+/**
+ * atomic64_set_release() - atomic set with release ordering
+ * @v: pointer to atomic64_t
+ * @i: s64 value to assign
+ *
+ * Atomically sets @v to @i with release ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_set_release() there.
+ *
+ * Return: Nothing.
+ */
 static __always_inline void
 atomic64_set_release(atomic64_t *v, s64 i)
 {
···
 	raw_atomic64_set_release(v, i);
 }
 
+/**
+ * atomic64_add() - atomic add with relaxed ordering
+ * @i: s64 value to add
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + @i) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_add() there.
+ *
+ * Return: Nothing.
+ */
 static __always_inline void
 atomic64_add(s64 i, atomic64_t *v)
 {
···
 	raw_atomic64_add(i, v);
 }
 
+/**
+ * atomic64_add_return() - atomic add with full ordering
+ * @i: s64 value to add
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + @i) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_add_return() there.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 atomic64_add_return(s64 i, atomic64_t *v)
 {
···
 	return raw_atomic64_add_return(i, v);
 }
 
+/**
+ * atomic64_add_return_acquire() - atomic add with acquire ordering
+ * @i: s64 value to add
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + @i) with acquire ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_add_return_acquire() there.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 atomic64_add_return_acquire(s64 i, atomic64_t *v)
 {
···
 	return raw_atomic64_add_return_acquire(i, v);
 }
 
+/**
+ * atomic64_add_return_release() - atomic add with release ordering
+ * @i: s64 value to add
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + @i) with release ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_add_return_release() there.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 atomic64_add_return_release(s64 i, atomic64_t *v)
 {
···
 	return raw_atomic64_add_return_release(i, v);
 }
 
+/**
+ * atomic64_add_return_relaxed() - atomic add with relaxed ordering
+ * @i: s64 value to add
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + @i) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_add_return_relaxed() there.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 atomic64_add_return_relaxed(s64 i, atomic64_t *v)
 {
···
 	return raw_atomic64_add_return_relaxed(i, v);
 }
 
+/**
+ * atomic64_fetch_add() - atomic add with full ordering
+ * @i: s64 value to add
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + @i) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_fetch_add() there.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 atomic64_fetch_add(s64 i, atomic64_t *v)
 {
···
 	return raw_atomic64_fetch_add(i, v);
 }
 
+/**
+ * atomic64_fetch_add_acquire() - atomic add with acquire ordering
+ * @i: s64 value to add
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + @i) with acquire ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_fetch_add_acquire() there.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 atomic64_fetch_add_acquire(s64 i, atomic64_t *v)
 {
···
 	return raw_atomic64_fetch_add_acquire(i, v);
 }
 
+/**
+ * atomic64_fetch_add_release() - atomic add with release ordering
+ * @i: s64 value to add
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + @i) with release ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_fetch_add_release() there.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 atomic64_fetch_add_release(s64 i, atomic64_t *v)
 {
···
 	return raw_atomic64_fetch_add_release(i, v);
 }
 
+/**
+ * atomic64_fetch_add_relaxed() - atomic add with relaxed ordering
+ * @i: s64 value to add
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + @i) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_fetch_add_relaxed() there.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 atomic64_fetch_add_relaxed(s64 i, atomic64_t *v)
 {
···
 	return raw_atomic64_fetch_add_relaxed(i, v);
 }
 
+/**
+ * atomic64_sub() - atomic subtract with relaxed ordering
+ * @i: s64 value to subtract
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - @i) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_sub() there.
+ *
+ * Return: Nothing.
+ */
 static __always_inline void
 atomic64_sub(s64 i, atomic64_t *v)
 {
···
 	raw_atomic64_sub(i, v);
 }
 
+/**
+ * atomic64_sub_return() - atomic subtract with full ordering
+ * @i: s64 value to subtract
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - @i) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_sub_return() there.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 atomic64_sub_return(s64 i, atomic64_t *v)
 {
···
 	return raw_atomic64_sub_return(i, v);
 }
 
+/**
+ * atomic64_sub_return_acquire() - atomic subtract with acquire ordering
+ * @i: s64 value to subtract
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - @i) with acquire ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_sub_return_acquire() there.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 atomic64_sub_return_acquire(s64 i, atomic64_t *v)
 {
···
 	return raw_atomic64_sub_return_acquire(i, v);
 }
 
+/**
+ * atomic64_sub_return_release() - atomic subtract with release ordering
+ * @i: s64 value to subtract
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - @i) with release ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_sub_return_release() there.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 atomic64_sub_return_release(s64 i, atomic64_t *v)
 {
···
 	return raw_atomic64_sub_return_release(i, v);
 }
 
+/**
+ * atomic64_sub_return_relaxed() - atomic subtract with relaxed ordering
+ * @i: s64 value to subtract
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - @i) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_sub_return_relaxed() there.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 atomic64_sub_return_relaxed(s64 i, atomic64_t *v)
 {
···
 	return raw_atomic64_sub_return_relaxed(i, v);
 }
 
+/**
+ * atomic64_fetch_sub() - atomic subtract with full ordering
+ * @i: s64 value to subtract
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - @i) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_fetch_sub() there.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 atomic64_fetch_sub(s64 i, atomic64_t *v)
 {
···
 	return raw_atomic64_fetch_sub(i, v);
 }
 
+/**
+ * atomic64_fetch_sub_acquire() - atomic subtract with acquire ordering
+ * @i: s64 value to subtract
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - @i) with acquire ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_fetch_sub_acquire() there.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 atomic64_fetch_sub_acquire(s64 i, atomic64_t *v)
 {
···
 	return raw_atomic64_fetch_sub_acquire(i, v);
 }
 
+/**
+ * atomic64_fetch_sub_release() - atomic subtract with release ordering
+ * @i: s64 value to subtract
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - @i) with release ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_fetch_sub_release() there.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 atomic64_fetch_sub_release(s64 i, atomic64_t *v)
 {
···
 	return raw_atomic64_fetch_sub_release(i, v);
 }
 
+/**
+ * atomic64_fetch_sub_relaxed() - atomic subtract with relaxed ordering
+ * @i: s64 value to subtract
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - @i) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_fetch_sub_relaxed() there.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 atomic64_fetch_sub_relaxed(s64 i, atomic64_t *v)
 {
···
 	return raw_atomic64_fetch_sub_relaxed(i, v);
 }
 
+/**
+ * atomic64_inc() - atomic increment with relaxed ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + 1) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_inc() there.
+ *
+ * Return: Nothing.
+ */
 static __always_inline void
 atomic64_inc(atomic64_t *v)
 {
···
 	raw_atomic64_inc(v);
 }
 
+/**
+ * atomic64_inc_return() - atomic increment with full ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + 1) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_inc_return() there.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 atomic64_inc_return(atomic64_t *v)
 {
···
 	return raw_atomic64_inc_return(v);
 }
 
+/**
+ * atomic64_inc_return_acquire() - atomic increment with acquire ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + 1) with acquire ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_inc_return_acquire() there.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 atomic64_inc_return_acquire(atomic64_t *v)
 {
···
 	return raw_atomic64_inc_return_acquire(v);
 }
 
+/**
+ * atomic64_inc_return_release() - atomic increment with release ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + 1) with release ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_inc_return_release() there.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 atomic64_inc_return_release(atomic64_t *v)
 {
···
 	return raw_atomic64_inc_return_release(v);
 }
 
+/**
+ * atomic64_inc_return_relaxed() - atomic increment with relaxed ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + 1) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_inc_return_relaxed() there.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 atomic64_inc_return_relaxed(atomic64_t *v)
 {
···
 	return raw_atomic64_inc_return_relaxed(v);
 }
 
+/**
+ * atomic64_fetch_inc() - atomic increment with full ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + 1) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_fetch_inc() there.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 atomic64_fetch_inc(atomic64_t *v)
 {
···
 	return raw_atomic64_fetch_inc(v);
 }
 
+/**
+ * atomic64_fetch_inc_acquire() - atomic increment with acquire ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + 1) with acquire ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_fetch_inc_acquire() there.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 atomic64_fetch_inc_acquire(atomic64_t *v)
 {
···
 	return raw_atomic64_fetch_inc_acquire(v);
 }
 
+/**
+ * atomic64_fetch_inc_release() - atomic increment with release ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + 1) with release ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_fetch_inc_release() there.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 atomic64_fetch_inc_release(atomic64_t *v)
 {
···
 	return raw_atomic64_fetch_inc_release(v);
 }
 
+/**
+ * atomic64_fetch_inc_relaxed() - atomic increment with relaxed ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v + 1) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_fetch_inc_relaxed() there.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 atomic64_fetch_inc_relaxed(atomic64_t *v)
 {
···
 	return raw_atomic64_fetch_inc_relaxed(v);
 }
 
+/**
+ * atomic64_dec() - atomic decrement with relaxed ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - 1) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_dec() there.
+ *
+ * Return: Nothing.
+ */
 static __always_inline void
 atomic64_dec(atomic64_t *v)
 {
···
 	raw_atomic64_dec(v);
 }
 
+/**
+ * atomic64_dec_return() - atomic decrement with full ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - 1) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_dec_return() there.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 atomic64_dec_return(atomic64_t *v)
 {
···
 	return raw_atomic64_dec_return(v);
 }
 
+/**
+ * atomic64_dec_return_acquire() - atomic decrement with acquire ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - 1) with acquire ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_dec_return_acquire() there.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 atomic64_dec_return_acquire(atomic64_t *v)
 {
···
 	return raw_atomic64_dec_return_acquire(v);
 }
 
+/**
+ * atomic64_dec_return_release() - atomic decrement with release ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - 1) with release ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_dec_return_release() there.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 atomic64_dec_return_release(atomic64_t *v)
 {
···
 	return raw_atomic64_dec_return_release(v);
 }
 
+/**
+ * atomic64_dec_return_relaxed() - atomic decrement with relaxed ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - 1) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_dec_return_relaxed() there.
+ *
+ * Return: The updated value of @v.
+ */
 static __always_inline s64
 atomic64_dec_return_relaxed(atomic64_t *v)
 {
···
 	return raw_atomic64_dec_return_relaxed(v);
 }
 
+/**
+ * atomic64_fetch_dec() - atomic decrement with full ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - 1) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_fetch_dec() there.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 atomic64_fetch_dec(atomic64_t *v)
 {
···
 	return raw_atomic64_fetch_dec(v);
 }
 
+/**
+ * atomic64_fetch_dec_acquire() - atomic decrement with acquire ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - 1) with acquire ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_fetch_dec_acquire() there.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 atomic64_fetch_dec_acquire(atomic64_t *v)
 {
···
 	return raw_atomic64_fetch_dec_acquire(v);
 }
 
+/**
+ * atomic64_fetch_dec_release() - atomic decrement with release ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - 1) with release ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_fetch_dec_release() there.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 atomic64_fetch_dec_release(atomic64_t *v)
 {
···
 	return raw_atomic64_fetch_dec_release(v);
 }
 
+/**
+ * atomic64_fetch_dec_relaxed() - atomic decrement with relaxed ordering
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v - 1) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_fetch_dec_relaxed() there.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 atomic64_fetch_dec_relaxed(atomic64_t *v)
 {
···
 	return raw_atomic64_fetch_dec_relaxed(v);
 }
 
+/**
+ * atomic64_and() - atomic bitwise AND with relaxed ordering
+ * @i: s64 value
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v & @i) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_and() there.
+ *
+ * Return: Nothing.
+ */
 static __always_inline void
 atomic64_and(s64 i, atomic64_t *v)
 {
···
 	raw_atomic64_and(i, v);
 }
 
+/**
+ * atomic64_fetch_and() - atomic bitwise AND with full ordering
+ * @i: s64 value
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v & @i) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_fetch_and() there.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 atomic64_fetch_and(s64 i, atomic64_t *v)
 {
···
 	return raw_atomic64_fetch_and(i, v);
 }
 
+/**
+ * atomic64_fetch_and_acquire() - atomic bitwise AND with acquire ordering
+ * @i: s64 value
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v & @i) with acquire ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_fetch_and_acquire() there.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 atomic64_fetch_and_acquire(s64 i, atomic64_t *v)
 {
···
 	return raw_atomic64_fetch_and_acquire(i, v);
 }
 
+/**
+ * atomic64_fetch_and_release() - atomic bitwise AND with release ordering
+ * @i: s64 value
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v & @i) with release ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_fetch_and_release() there.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 atomic64_fetch_and_release(s64 i, atomic64_t *v)
 {
···
 	return raw_atomic64_fetch_and_release(i, v);
 }
 
+/**
+ * atomic64_fetch_and_relaxed() - atomic bitwise AND with relaxed ordering
+ * @i: s64 value
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v & @i) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_fetch_and_relaxed() there.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 atomic64_fetch_and_relaxed(s64 i, atomic64_t *v)
 {
···
 	return raw_atomic64_fetch_and_relaxed(i, v);
 }
 
+/**
+ * atomic64_andnot() - atomic bitwise AND NOT with relaxed ordering
+ * @i: s64 value
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v & ~@i) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_andnot() there.
+ *
+ * Return: Nothing.
+ */
 static __always_inline void
 atomic64_andnot(s64 i, atomic64_t *v)
 {
···
 	raw_atomic64_andnot(i, v);
 }
 
+/**
+ * atomic64_fetch_andnot() - atomic bitwise AND NOT with full ordering
+ * @i: s64 value
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v & ~@i) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_fetch_andnot() there.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 atomic64_fetch_andnot(s64 i, atomic64_t *v)
 {
···
 	return raw_atomic64_fetch_andnot(i, v);
 }
 
+/**
+ * atomic64_fetch_andnot_acquire() - atomic bitwise AND NOT with acquire ordering
+ * @i: s64 value
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v & ~@i) with acquire ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_fetch_andnot_acquire() there.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 atomic64_fetch_andnot_acquire(s64 i, atomic64_t *v)
 {
···
 	return raw_atomic64_fetch_andnot_acquire(i, v);
 }
 
+/**
+ * atomic64_fetch_andnot_release() - atomic bitwise AND NOT with release ordering
+ * @i: s64 value
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v & ~@i) with release ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_fetch_andnot_release() there.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 atomic64_fetch_andnot_release(s64 i, atomic64_t *v)
 {
···
 	return raw_atomic64_fetch_andnot_release(i, v);
 }
 
+/**
+ * atomic64_fetch_andnot_relaxed() - atomic bitwise AND NOT with relaxed ordering
+ * @i: s64 value
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v & ~@i) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_fetch_andnot_relaxed() there.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 atomic64_fetch_andnot_relaxed(s64 i, atomic64_t *v)
 {
···
 	return raw_atomic64_fetch_andnot_relaxed(i, v);
 }
 
+/**
+ * atomic64_or() - atomic bitwise OR with relaxed ordering
+ * @i: s64 value
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v | @i) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_or() there.
+ *
+ * Return: Nothing.
+ */
 static __always_inline void
 atomic64_or(s64 i, atomic64_t *v)
 {
···
 	raw_atomic64_or(i, v);
 }
 
+/**
+ * atomic64_fetch_or() - atomic bitwise OR with full ordering
+ * @i: s64 value
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v | @i) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_fetch_or() there.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 atomic64_fetch_or(s64 i, atomic64_t *v)
 {
···
 	return raw_atomic64_fetch_or(i, v);
 }
 
+/**
+ * atomic64_fetch_or_acquire() - atomic bitwise OR with acquire ordering
+ * @i: s64 value
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v | @i) with acquire ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_fetch_or_acquire() there.
+ *
+ * Return: The original value of @v.
+ */
 static __always_inline s64
 atomic64_fetch_or_acquire(s64 i, atomic64_t *v)
 {
···
 	return raw_atomic64_fetch_or_acquire(i, v);
 }
 
+/**
+ * atomic64_fetch_or_release() - atomic bitwise OR with release ordering
+ * @i: s64 value
+ * @v: pointer to atomic64_t
+ *
+ * Atomically updates @v to (@v | @i) with release ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic64_fetch_or_release() there.
+ *
+ * Return: The original value of @v.
1063 + */ 2539 1064 static __always_inline s64 2540 1065 atomic64_fetch_or_release(s64 i, atomic64_t *v) 2541 1066 { ··· 2555 1058 return raw_atomic64_fetch_or_release(i, v); 2556 1059 } 2557 1060 1061 + /** 1062 + * atomic64_fetch_or_relaxed() - atomic bitwise OR with relaxed ordering 1063 + * @i: s64 value 1064 + * @v: pointer to atomic64_t 1065 + * 1066 + * Atomically updates @v to (@v | @i) with relaxed ordering. 1067 + * 1068 + * Unsafe to use in noinstr code; use raw_atomic64_fetch_or_relaxed() there. 1069 + * 1070 + * Return: The original value of @v. 1071 + */ 2558 1072 static __always_inline s64 2559 1073 atomic64_fetch_or_relaxed(s64 i, atomic64_t *v) 2560 1074 { ··· 2573 1065 return raw_atomic64_fetch_or_relaxed(i, v); 2574 1066 } 2575 1067 1068 + /** 1069 + * atomic64_xor() - atomic bitwise XOR with relaxed ordering 1070 + * @i: s64 value 1071 + * @v: pointer to atomic64_t 1072 + * 1073 + * Atomically updates @v to (@v ^ @i) with relaxed ordering. 1074 + * 1075 + * Unsafe to use in noinstr code; use raw_atomic64_xor() there. 1076 + * 1077 + * Return: Nothing. 1078 + */ 2576 1079 static __always_inline void 2577 1080 atomic64_xor(s64 i, atomic64_t *v) 2578 1081 { ··· 2591 1072 raw_atomic64_xor(i, v); 2592 1073 } 2593 1074 1075 + /** 1076 + * atomic64_fetch_xor() - atomic bitwise XOR with full ordering 1077 + * @i: s64 value 1078 + * @v: pointer to atomic64_t 1079 + * 1080 + * Atomically updates @v to (@v ^ @i) with full ordering. 1081 + * 1082 + * Unsafe to use in noinstr code; use raw_atomic64_fetch_xor() there. 1083 + * 1084 + * Return: The original value of @v. 
1085 + */ 2594 1086 static __always_inline s64 2595 1087 atomic64_fetch_xor(s64 i, atomic64_t *v) 2596 1088 { ··· 2610 1080 return raw_atomic64_fetch_xor(i, v); 2611 1081 } 2612 1082 1083 + /** 1084 + * atomic64_fetch_xor_acquire() - atomic bitwise XOR with acquire ordering 1085 + * @i: s64 value 1086 + * @v: pointer to atomic64_t 1087 + * 1088 + * Atomically updates @v to (@v ^ @i) with acquire ordering. 1089 + * 1090 + * Unsafe to use in noinstr code; use raw_atomic64_fetch_xor_acquire() there. 1091 + * 1092 + * Return: The original value of @v. 1093 + */ 2613 1094 static __always_inline s64 2614 1095 atomic64_fetch_xor_acquire(s64 i, atomic64_t *v) 2615 1096 { ··· 2628 1087 return raw_atomic64_fetch_xor_acquire(i, v); 2629 1088 } 2630 1089 1090 + /** 1091 + * atomic64_fetch_xor_release() - atomic bitwise XOR with release ordering 1092 + * @i: s64 value 1093 + * @v: pointer to atomic64_t 1094 + * 1095 + * Atomically updates @v to (@v ^ @i) with release ordering. 1096 + * 1097 + * Unsafe to use in noinstr code; use raw_atomic64_fetch_xor_release() there. 1098 + * 1099 + * Return: The original value of @v. 1100 + */ 2631 1101 static __always_inline s64 2632 1102 atomic64_fetch_xor_release(s64 i, atomic64_t *v) 2633 1103 { ··· 2647 1095 return raw_atomic64_fetch_xor_release(i, v); 2648 1096 } 2649 1097 1098 + /** 1099 + * atomic64_fetch_xor_relaxed() - atomic bitwise XOR with relaxed ordering 1100 + * @i: s64 value 1101 + * @v: pointer to atomic64_t 1102 + * 1103 + * Atomically updates @v to (@v ^ @i) with relaxed ordering. 1104 + * 1105 + * Unsafe to use in noinstr code; use raw_atomic64_fetch_xor_relaxed() there. 1106 + * 1107 + * Return: The original value of @v. 
1108 + */ 2650 1109 static __always_inline s64 2651 1110 atomic64_fetch_xor_relaxed(s64 i, atomic64_t *v) 2652 1111 { ··· 2665 1102 return raw_atomic64_fetch_xor_relaxed(i, v); 2666 1103 } 2667 1104 1105 + /** 1106 + * atomic64_xchg() - atomic exchange with full ordering 1107 + * @v: pointer to atomic64_t 1108 + * @new: s64 value to assign 1109 + * 1110 + * Atomically updates @v to @new with full ordering. 1111 + * 1112 + * Unsafe to use in noinstr code; use raw_atomic64_xchg() there. 1113 + * 1114 + * Return: The original value of @v. 1115 + */ 2668 1116 static __always_inline s64 2669 1117 atomic64_xchg(atomic64_t *v, s64 new) 2670 1118 { ··· 2684 1110 return raw_atomic64_xchg(v, new); 2685 1111 } 2686 1112 1113 + /** 1114 + * atomic64_xchg_acquire() - atomic exchange with acquire ordering 1115 + * @v: pointer to atomic64_t 1116 + * @new: s64 value to assign 1117 + * 1118 + * Atomically updates @v to @new with acquire ordering. 1119 + * 1120 + * Unsafe to use in noinstr code; use raw_atomic64_xchg_acquire() there. 1121 + * 1122 + * Return: The original value of @v. 1123 + */ 2687 1124 static __always_inline s64 2688 1125 atomic64_xchg_acquire(atomic64_t *v, s64 new) 2689 1126 { ··· 2702 1117 return raw_atomic64_xchg_acquire(v, new); 2703 1118 } 2704 1119 1120 + /** 1121 + * atomic64_xchg_release() - atomic exchange with release ordering 1122 + * @v: pointer to atomic64_t 1123 + * @new: s64 value to assign 1124 + * 1125 + * Atomically updates @v to @new with release ordering. 1126 + * 1127 + * Unsafe to use in noinstr code; use raw_atomic64_xchg_release() there. 1128 + * 1129 + * Return: The original value of @v. 
1130 + */ 2705 1131 static __always_inline s64 2706 1132 atomic64_xchg_release(atomic64_t *v, s64 new) 2707 1133 { ··· 2721 1125 return raw_atomic64_xchg_release(v, new); 2722 1126 } 2723 1127 1128 + /** 1129 + * atomic64_xchg_relaxed() - atomic exchange with relaxed ordering 1130 + * @v: pointer to atomic64_t 1131 + * @new: s64 value to assign 1132 + * 1133 + * Atomically updates @v to @new with relaxed ordering. 1134 + * 1135 + * Unsafe to use in noinstr code; use raw_atomic64_xchg_relaxed() there. 1136 + * 1137 + * Return: The original value of @v. 1138 + */ 2724 1139 static __always_inline s64 2725 1140 atomic64_xchg_relaxed(atomic64_t *v, s64 new) 2726 1141 { ··· 2739 1132 return raw_atomic64_xchg_relaxed(v, new); 2740 1133 } 2741 1134 1135 + /** 1136 + * atomic64_cmpxchg() - atomic compare and exchange with full ordering 1137 + * @v: pointer to atomic64_t 1138 + * @old: s64 value to compare with 1139 + * @new: s64 value to assign 1140 + * 1141 + * If (@v == @old), atomically updates @v to @new with full ordering. 1142 + * 1143 + * Unsafe to use in noinstr code; use raw_atomic64_cmpxchg() there. 1144 + * 1145 + * Return: The original value of @v. 1146 + */ 2742 1147 static __always_inline s64 2743 1148 atomic64_cmpxchg(atomic64_t *v, s64 old, s64 new) 2744 1149 { ··· 2759 1140 return raw_atomic64_cmpxchg(v, old, new); 2760 1141 } 2761 1142 1143 + /** 1144 + * atomic64_cmpxchg_acquire() - atomic compare and exchange with acquire ordering 1145 + * @v: pointer to atomic64_t 1146 + * @old: s64 value to compare with 1147 + * @new: s64 value to assign 1148 + * 1149 + * If (@v == @old), atomically updates @v to @new with acquire ordering. 1150 + * 1151 + * Unsafe to use in noinstr code; use raw_atomic64_cmpxchg_acquire() there. 1152 + * 1153 + * Return: The original value of @v. 
1154 + */ 2762 1155 static __always_inline s64 2763 1156 atomic64_cmpxchg_acquire(atomic64_t *v, s64 old, s64 new) 2764 1157 { ··· 2778 1147 return raw_atomic64_cmpxchg_acquire(v, old, new); 2779 1148 } 2780 1149 1150 + /** 1151 + * atomic64_cmpxchg_release() - atomic compare and exchange with release ordering 1152 + * @v: pointer to atomic64_t 1153 + * @old: s64 value to compare with 1154 + * @new: s64 value to assign 1155 + * 1156 + * If (@v == @old), atomically updates @v to @new with release ordering. 1157 + * 1158 + * Unsafe to use in noinstr code; use raw_atomic64_cmpxchg_release() there. 1159 + * 1160 + * Return: The original value of @v. 1161 + */ 2781 1162 static __always_inline s64 2782 1163 atomic64_cmpxchg_release(atomic64_t *v, s64 old, s64 new) 2783 1164 { ··· 2798 1155 return raw_atomic64_cmpxchg_release(v, old, new); 2799 1156 } 2800 1157 1158 + /** 1159 + * atomic64_cmpxchg_relaxed() - atomic compare and exchange with relaxed ordering 1160 + * @v: pointer to atomic64_t 1161 + * @old: s64 value to compare with 1162 + * @new: s64 value to assign 1163 + * 1164 + * If (@v == @old), atomically updates @v to @new with relaxed ordering. 1165 + * 1166 + * Unsafe to use in noinstr code; use raw_atomic64_cmpxchg_relaxed() there. 1167 + * 1168 + * Return: The original value of @v. 1169 + */ 2801 1170 static __always_inline s64 2802 1171 atomic64_cmpxchg_relaxed(atomic64_t *v, s64 old, s64 new) 2803 1172 { ··· 2817 1162 return raw_atomic64_cmpxchg_relaxed(v, old, new); 2818 1163 } 2819 1164 1165 + /** 1166 + * atomic64_try_cmpxchg() - atomic compare and exchange with full ordering 1167 + * @v: pointer to atomic64_t 1168 + * @old: pointer to s64 value to compare with 1169 + * @new: s64 value to assign 1170 + * 1171 + * If (@v == @old), atomically updates @v to @new with full ordering. 1172 + * Otherwise, updates @old to the current value of @v. 1173 + * 1174 + * Unsafe to use in noinstr code; use raw_atomic64_try_cmpxchg() there. 
1175 + * 1176 + * Return: @true if the exchange occured, @false otherwise. 1177 + */ 2820 1178 static __always_inline bool 2821 1179 atomic64_try_cmpxchg(atomic64_t *v, s64 *old, s64 new) 2822 1180 { ··· 2839 1171 return raw_atomic64_try_cmpxchg(v, old, new); 2840 1172 } 2841 1173 1174 + /** 1175 + * atomic64_try_cmpxchg_acquire() - atomic compare and exchange with acquire ordering 1176 + * @v: pointer to atomic64_t 1177 + * @old: pointer to s64 value to compare with 1178 + * @new: s64 value to assign 1179 + * 1180 + * If (@v == @old), atomically updates @v to @new with acquire ordering. 1181 + * Otherwise, updates @old to the current value of @v. 1182 + * 1183 + * Unsafe to use in noinstr code; use raw_atomic64_try_cmpxchg_acquire() there. 1184 + * 1185 + * Return: @true if the exchange occured, @false otherwise. 1186 + */ 2842 1187 static __always_inline bool 2843 1188 atomic64_try_cmpxchg_acquire(atomic64_t *v, s64 *old, s64 new) 2844 1189 { ··· 2860 1179 return raw_atomic64_try_cmpxchg_acquire(v, old, new); 2861 1180 } 2862 1181 1182 + /** 1183 + * atomic64_try_cmpxchg_release() - atomic compare and exchange with release ordering 1184 + * @v: pointer to atomic64_t 1185 + * @old: pointer to s64 value to compare with 1186 + * @new: s64 value to assign 1187 + * 1188 + * If (@v == @old), atomically updates @v to @new with release ordering. 1189 + * Otherwise, updates @old to the current value of @v. 1190 + * 1191 + * Unsafe to use in noinstr code; use raw_atomic64_try_cmpxchg_release() there. 1192 + * 1193 + * Return: @true if the exchange occured, @false otherwise. 
1194 + */ 2863 1195 static __always_inline bool 2864 1196 atomic64_try_cmpxchg_release(atomic64_t *v, s64 *old, s64 new) 2865 1197 { ··· 2882 1188 return raw_atomic64_try_cmpxchg_release(v, old, new); 2883 1189 } 2884 1190 1191 + /** 1192 + * atomic64_try_cmpxchg_relaxed() - atomic compare and exchange with relaxed ordering 1193 + * @v: pointer to atomic64_t 1194 + * @old: pointer to s64 value to compare with 1195 + * @new: s64 value to assign 1196 + * 1197 + * If (@v == @old), atomically updates @v to @new with relaxed ordering. 1198 + * Otherwise, updates @old to the current value of @v. 1199 + * 1200 + * Unsafe to use in noinstr code; use raw_atomic64_try_cmpxchg_relaxed() there. 1201 + * 1202 + * Return: @true if the exchange occured, @false otherwise. 1203 + */ 2885 1204 static __always_inline bool 2886 1205 atomic64_try_cmpxchg_relaxed(atomic64_t *v, s64 *old, s64 new) 2887 1206 { ··· 2903 1196 return raw_atomic64_try_cmpxchg_relaxed(v, old, new); 2904 1197 } 2905 1198 1199 + /** 1200 + * atomic64_sub_and_test() - atomic subtract and test if zero with full ordering 1201 + * @i: s64 value to add 1202 + * @v: pointer to atomic64_t 1203 + * 1204 + * Atomically updates @v to (@v - @i) with full ordering. 1205 + * 1206 + * Unsafe to use in noinstr code; use raw_atomic64_sub_and_test() there. 1207 + * 1208 + * Return: @true if the resulting value of @v is zero, @false otherwise. 1209 + */ 2906 1210 static __always_inline bool 2907 1211 atomic64_sub_and_test(s64 i, atomic64_t *v) 2908 1212 { ··· 2922 1204 return raw_atomic64_sub_and_test(i, v); 2923 1205 } 2924 1206 1207 + /** 1208 + * atomic64_dec_and_test() - atomic decrement and test if zero with full ordering 1209 + * @v: pointer to atomic64_t 1210 + * 1211 + * Atomically updates @v to (@v - 1) with full ordering. 1212 + * 1213 + * Unsafe to use in noinstr code; use raw_atomic64_dec_and_test() there. 1214 + * 1215 + * Return: @true if the resulting value of @v is zero, @false otherwise. 
1216 + */ 2925 1217 static __always_inline bool 2926 1218 atomic64_dec_and_test(atomic64_t *v) 2927 1219 { ··· 2940 1212 return raw_atomic64_dec_and_test(v); 2941 1213 } 2942 1214 1215 + /** 1216 + * atomic64_inc_and_test() - atomic increment and test if zero with full ordering 1217 + * @v: pointer to atomic64_t 1218 + * 1219 + * Atomically updates @v to (@v + 1) with full ordering. 1220 + * 1221 + * Unsafe to use in noinstr code; use raw_atomic64_inc_and_test() there. 1222 + * 1223 + * Return: @true if the resulting value of @v is zero, @false otherwise. 1224 + */ 2943 1225 static __always_inline bool 2944 1226 atomic64_inc_and_test(atomic64_t *v) 2945 1227 { ··· 2958 1220 return raw_atomic64_inc_and_test(v); 2959 1221 } 2960 1222 1223 + /** 1224 + * atomic64_add_negative() - atomic add and test if negative with full ordering 1225 + * @i: s64 value to add 1226 + * @v: pointer to atomic64_t 1227 + * 1228 + * Atomically updates @v to (@v + @i) with full ordering. 1229 + * 1230 + * Unsafe to use in noinstr code; use raw_atomic64_add_negative() there. 1231 + * 1232 + * Return: @true if the resulting value of @v is negative, @false otherwise. 1233 + */ 2961 1234 static __always_inline bool 2962 1235 atomic64_add_negative(s64 i, atomic64_t *v) 2963 1236 { ··· 2977 1228 return raw_atomic64_add_negative(i, v); 2978 1229 } 2979 1230 1231 + /** 1232 + * atomic64_add_negative_acquire() - atomic add and test if negative with acquire ordering 1233 + * @i: s64 value to add 1234 + * @v: pointer to atomic64_t 1235 + * 1236 + * Atomically updates @v to (@v + @i) with acquire ordering. 1237 + * 1238 + * Unsafe to use in noinstr code; use raw_atomic64_add_negative_acquire() there. 1239 + * 1240 + * Return: @true if the resulting value of @v is negative, @false otherwise. 
1241 + */ 2980 1242 static __always_inline bool 2981 1243 atomic64_add_negative_acquire(s64 i, atomic64_t *v) 2982 1244 { ··· 2995 1235 return raw_atomic64_add_negative_acquire(i, v); 2996 1236 } 2997 1237 1238 + /** 1239 + * atomic64_add_negative_release() - atomic add and test if negative with release ordering 1240 + * @i: s64 value to add 1241 + * @v: pointer to atomic64_t 1242 + * 1243 + * Atomically updates @v to (@v + @i) with release ordering. 1244 + * 1245 + * Unsafe to use in noinstr code; use raw_atomic64_add_negative_release() there. 1246 + * 1247 + * Return: @true if the resulting value of @v is negative, @false otherwise. 1248 + */ 2998 1249 static __always_inline bool 2999 1250 atomic64_add_negative_release(s64 i, atomic64_t *v) 3000 1251 { ··· 3014 1243 return raw_atomic64_add_negative_release(i, v); 3015 1244 } 3016 1245 1246 + /** 1247 + * atomic64_add_negative_relaxed() - atomic add and test if negative with relaxed ordering 1248 + * @i: s64 value to add 1249 + * @v: pointer to atomic64_t 1250 + * 1251 + * Atomically updates @v to (@v + @i) with relaxed ordering. 1252 + * 1253 + * Unsafe to use in noinstr code; use raw_atomic64_add_negative_relaxed() there. 1254 + * 1255 + * Return: @true if the resulting value of @v is negative, @false otherwise. 1256 + */ 3017 1257 static __always_inline bool 3018 1258 atomic64_add_negative_relaxed(s64 i, atomic64_t *v) 3019 1259 { ··· 3032 1250 return raw_atomic64_add_negative_relaxed(i, v); 3033 1251 } 3034 1252 1253 + /** 1254 + * atomic64_fetch_add_unless() - atomic add unless value with full ordering 1255 + * @v: pointer to atomic64_t 1256 + * @a: s64 value to add 1257 + * @u: s64 value to compare with 1258 + * 1259 + * If (@v != @u), atomically updates @v to (@v + @a) with full ordering. 1260 + * 1261 + * Unsafe to use in noinstr code; use raw_atomic64_fetch_add_unless() there. 1262 + * 1263 + * Return: The original value of @v. 
1264 + */ 3035 1265 static __always_inline s64 3036 1266 atomic64_fetch_add_unless(atomic64_t *v, s64 a, s64 u) 3037 1267 { ··· 3052 1258 return raw_atomic64_fetch_add_unless(v, a, u); 3053 1259 } 3054 1260 1261 + /** 1262 + * atomic64_add_unless() - atomic add unless value with full ordering 1263 + * @v: pointer to atomic64_t 1264 + * @a: s64 value to add 1265 + * @u: s64 value to compare with 1266 + * 1267 + * If (@v != @u), atomically updates @v to (@v + @a) with full ordering. 1268 + * 1269 + * Unsafe to use in noinstr code; use raw_atomic64_add_unless() there. 1270 + * 1271 + * Return: @true if @v was updated, @false otherwise. 1272 + */ 3055 1273 static __always_inline bool 3056 1274 atomic64_add_unless(atomic64_t *v, s64 a, s64 u) 3057 1275 { ··· 3072 1266 return raw_atomic64_add_unless(v, a, u); 3073 1267 } 3074 1268 1269 + /** 1270 + * atomic64_inc_not_zero() - atomic increment unless zero with full ordering 1271 + * @v: pointer to atomic64_t 1272 + * 1273 + * If (@v != 0), atomically updates @v to (@v + 1) with full ordering. 1274 + * 1275 + * Unsafe to use in noinstr code; use raw_atomic64_inc_not_zero() there. 1276 + * 1277 + * Return: @true if @v was updated, @false otherwise. 1278 + */ 3075 1279 static __always_inline bool 3076 1280 atomic64_inc_not_zero(atomic64_t *v) 3077 1281 { ··· 3090 1274 return raw_atomic64_inc_not_zero(v); 3091 1275 } 3092 1276 1277 + /** 1278 + * atomic64_inc_unless_negative() - atomic increment unless negative with full ordering 1279 + * @v: pointer to atomic64_t 1280 + * 1281 + * If (@v >= 0), atomically updates @v to (@v + 1) with full ordering. 1282 + * 1283 + * Unsafe to use in noinstr code; use raw_atomic64_inc_unless_negative() there. 1284 + * 1285 + * Return: @true if @v was updated, @false otherwise. 
1286 + */ 3093 1287 static __always_inline bool 3094 1288 atomic64_inc_unless_negative(atomic64_t *v) 3095 1289 { ··· 3108 1282 return raw_atomic64_inc_unless_negative(v); 3109 1283 } 3110 1284 1285 + /** 1286 + * atomic64_dec_unless_positive() - atomic decrement unless positive with full ordering 1287 + * @v: pointer to atomic64_t 1288 + * 1289 + * If (@v <= 0), atomically updates @v to (@v - 1) with full ordering. 1290 + * 1291 + * Unsafe to use in noinstr code; use raw_atomic64_dec_unless_positive() there. 1292 + * 1293 + * Return: @true if @v was updated, @false otherwise. 1294 + */ 3111 1295 static __always_inline bool 3112 1296 atomic64_dec_unless_positive(atomic64_t *v) 3113 1297 { ··· 3126 1290 return raw_atomic64_dec_unless_positive(v); 3127 1291 } 3128 1292 1293 + /** 1294 + * atomic64_dec_if_positive() - atomic decrement if positive with full ordering 1295 + * @v: pointer to atomic64_t 1296 + * 1297 + * If (@v > 0), atomically updates @v to (@v - 1) with full ordering. 1298 + * 1299 + * Unsafe to use in noinstr code; use raw_atomic64_dec_if_positive() there. 1300 + * 1301 + * Return: @true if @v was updated, @false otherwise. 1302 + */ 3129 1303 static __always_inline s64 3130 1304 atomic64_dec_if_positive(atomic64_t *v) 3131 1305 { ··· 3144 1298 return raw_atomic64_dec_if_positive(v); 3145 1299 } 3146 1300 1301 + /** 1302 + * atomic_long_read() - atomic load with relaxed ordering 1303 + * @v: pointer to atomic_long_t 1304 + * 1305 + * Atomically loads the value of @v with relaxed ordering. 1306 + * 1307 + * Unsafe to use in noinstr code; use raw_atomic_long_read() there. 1308 + * 1309 + * Return: The value loaded from @v. 
1310 + */ 3147 1311 static __always_inline long 3148 1312 atomic_long_read(const atomic_long_t *v) 3149 1313 { ··· 3161 1305 return raw_atomic_long_read(v); 3162 1306 } 3163 1307 1308 + /** 1309 + * atomic_long_read_acquire() - atomic load with acquire ordering 1310 + * @v: pointer to atomic_long_t 1311 + * 1312 + * Atomically loads the value of @v with acquire ordering. 1313 + * 1314 + * Unsafe to use in noinstr code; use raw_atomic_long_read_acquire() there. 1315 + * 1316 + * Return: The value loaded from @v. 1317 + */ 3164 1318 static __always_inline long 3165 1319 atomic_long_read_acquire(const atomic_long_t *v) 3166 1320 { ··· 3178 1312 return raw_atomic_long_read_acquire(v); 3179 1313 } 3180 1314 1315 + /** 1316 + * atomic_long_set() - atomic set with relaxed ordering 1317 + * @v: pointer to atomic_long_t 1318 + * @i: long value to assign 1319 + * 1320 + * Atomically sets @v to @i with relaxed ordering. 1321 + * 1322 + * Unsafe to use in noinstr code; use raw_atomic_long_set() there. 1323 + * 1324 + * Return: Nothing. 1325 + */ 3181 1326 static __always_inline void 3182 1327 atomic_long_set(atomic_long_t *v, long i) 3183 1328 { ··· 3196 1319 raw_atomic_long_set(v, i); 3197 1320 } 3198 1321 1322 + /** 1323 + * atomic_long_set_release() - atomic set with release ordering 1324 + * @v: pointer to atomic_long_t 1325 + * @i: long value to assign 1326 + * 1327 + * Atomically sets @v to @i with release ordering. 1328 + * 1329 + * Unsafe to use in noinstr code; use raw_atomic_long_set_release() there. 1330 + * 1331 + * Return: Nothing. 1332 + */ 3199 1333 static __always_inline void 3200 1334 atomic_long_set_release(atomic_long_t *v, long i) 3201 1335 { ··· 3215 1327 raw_atomic_long_set_release(v, i); 3216 1328 } 3217 1329 1330 + /** 1331 + * atomic_long_add() - atomic add with relaxed ordering 1332 + * @i: long value to add 1333 + * @v: pointer to atomic_long_t 1334 + * 1335 + * Atomically updates @v to (@v + @i) with relaxed ordering. 
1336 + * 1337 + * Unsafe to use in noinstr code; use raw_atomic_long_add() there. 1338 + * 1339 + * Return: Nothing. 1340 + */ 3218 1341 static __always_inline void 3219 1342 atomic_long_add(long i, atomic_long_t *v) 3220 1343 { ··· 3233 1334 raw_atomic_long_add(i, v); 3234 1335 } 3235 1336 1337 + /** 1338 + * atomic_long_add_return() - atomic add with full ordering 1339 + * @i: long value to add 1340 + * @v: pointer to atomic_long_t 1341 + * 1342 + * Atomically updates @v to (@v + @i) with full ordering. 1343 + * 1344 + * Unsafe to use in noinstr code; use raw_atomic_long_add_return() there. 1345 + * 1346 + * Return: The updated value of @v. 1347 + */ 3236 1348 static __always_inline long 3237 1349 atomic_long_add_return(long i, atomic_long_t *v) 3238 1350 { ··· 3252 1342 return raw_atomic_long_add_return(i, v); 3253 1343 } 3254 1344 1345 + /** 1346 + * atomic_long_add_return_acquire() - atomic add with acquire ordering 1347 + * @i: long value to add 1348 + * @v: pointer to atomic_long_t 1349 + * 1350 + * Atomically updates @v to (@v + @i) with acquire ordering. 1351 + * 1352 + * Unsafe to use in noinstr code; use raw_atomic_long_add_return_acquire() there. 1353 + * 1354 + * Return: The updated value of @v. 1355 + */ 3255 1356 static __always_inline long 3256 1357 atomic_long_add_return_acquire(long i, atomic_long_t *v) 3257 1358 { ··· 3270 1349 return raw_atomic_long_add_return_acquire(i, v); 3271 1350 } 3272 1351 1352 + /** 1353 + * atomic_long_add_return_release() - atomic add with release ordering 1354 + * @i: long value to add 1355 + * @v: pointer to atomic_long_t 1356 + * 1357 + * Atomically updates @v to (@v + @i) with release ordering. 1358 + * 1359 + * Unsafe to use in noinstr code; use raw_atomic_long_add_return_release() there. 1360 + * 1361 + * Return: The updated value of @v. 
1362 + */ 3273 1363 static __always_inline long 3274 1364 atomic_long_add_return_release(long i, atomic_long_t *v) 3275 1365 { ··· 3289 1357 return raw_atomic_long_add_return_release(i, v); 3290 1358 } 3291 1359 1360 + /** 1361 + * atomic_long_add_return_relaxed() - atomic add with relaxed ordering 1362 + * @i: long value to add 1363 + * @v: pointer to atomic_long_t 1364 + * 1365 + * Atomically updates @v to (@v + @i) with relaxed ordering. 1366 + * 1367 + * Unsafe to use in noinstr code; use raw_atomic_long_add_return_relaxed() there. 1368 + * 1369 + * Return: The updated value of @v. 1370 + */ 3292 1371 static __always_inline long 3293 1372 atomic_long_add_return_relaxed(long i, atomic_long_t *v) 3294 1373 { ··· 3307 1364 return raw_atomic_long_add_return_relaxed(i, v); 3308 1365 } 3309 1366 1367 + /** 1368 + * atomic_long_fetch_add() - atomic add with full ordering 1369 + * @i: long value to add 1370 + * @v: pointer to atomic_long_t 1371 + * 1372 + * Atomically updates @v to (@v + @i) with full ordering. 1373 + * 1374 + * Unsafe to use in noinstr code; use raw_atomic_long_fetch_add() there. 1375 + * 1376 + * Return: The original value of @v. 1377 + */ 3310 1378 static __always_inline long 3311 1379 atomic_long_fetch_add(long i, atomic_long_t *v) 3312 1380 { ··· 3326 1372 return raw_atomic_long_fetch_add(i, v); 3327 1373 } 3328 1374 1375 + /** 1376 + * atomic_long_fetch_add_acquire() - atomic add with acquire ordering 1377 + * @i: long value to add 1378 + * @v: pointer to atomic_long_t 1379 + * 1380 + * Atomically updates @v to (@v + @i) with acquire ordering. 1381 + * 1382 + * Unsafe to use in noinstr code; use raw_atomic_long_fetch_add_acquire() there. 1383 + * 1384 + * Return: The original value of @v. 
1385 + */ 3329 1386 static __always_inline long 3330 1387 atomic_long_fetch_add_acquire(long i, atomic_long_t *v) 3331 1388 { ··· 3344 1379 return raw_atomic_long_fetch_add_acquire(i, v); 3345 1380 } 3346 1381 1382 + /** 1383 + * atomic_long_fetch_add_release() - atomic add with release ordering 1384 + * @i: long value to add 1385 + * @v: pointer to atomic_long_t 1386 + * 1387 + * Atomically updates @v to (@v + @i) with release ordering. 1388 + * 1389 + * Unsafe to use in noinstr code; use raw_atomic_long_fetch_add_release() there. 1390 + * 1391 + * Return: The original value of @v. 1392 + */ 3347 1393 static __always_inline long 3348 1394 atomic_long_fetch_add_release(long i, atomic_long_t *v) 3349 1395 { ··· 3363 1387 return raw_atomic_long_fetch_add_release(i, v); 3364 1388 } 3365 1389 1390 + /** 1391 + * atomic_long_fetch_add_relaxed() - atomic add with relaxed ordering 1392 + * @i: long value to add 1393 + * @v: pointer to atomic_long_t 1394 + * 1395 + * Atomically updates @v to (@v + @i) with relaxed ordering. 1396 + * 1397 + * Unsafe to use in noinstr code; use raw_atomic_long_fetch_add_relaxed() there. 1398 + * 1399 + * Return: The original value of @v. 1400 + */ 3366 1401 static __always_inline long 3367 1402 atomic_long_fetch_add_relaxed(long i, atomic_long_t *v) 3368 1403 { ··· 3381 1394 return raw_atomic_long_fetch_add_relaxed(i, v); 3382 1395 } 3383 1396 1397 + /** 1398 + * atomic_long_sub() - atomic subtract with relaxed ordering 1399 + * @i: long value to subtract 1400 + * @v: pointer to atomic_long_t 1401 + * 1402 + * Atomically updates @v to (@v - @i) with relaxed ordering. 1403 + * 1404 + * Unsafe to use in noinstr code; use raw_atomic_long_sub() there. 1405 + * 1406 + * Return: Nothing. 
+ */
static __always_inline void
atomic_long_sub(long i, atomic_long_t *v)
{
	···
	raw_atomic_long_sub(i, v);
}

+ /**
+ * atomic_long_sub_return() - atomic subtract with full ordering
+ * @i: long value to subtract
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v - @i) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_sub_return() there.
+ *
+ * Return: The updated value of @v.
+ */
static __always_inline long
atomic_long_sub_return(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_sub_return(i, v);
}

+ /**
+ * atomic_long_sub_return_acquire() - atomic subtract with acquire ordering
+ * @i: long value to subtract
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v - @i) with acquire ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_sub_return_acquire() there.
+ *
+ * Return: The updated value of @v.
+ */
static __always_inline long
atomic_long_sub_return_acquire(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_sub_return_acquire(i, v);
}

+ /**
+ * atomic_long_sub_return_release() - atomic subtract with release ordering
+ * @i: long value to subtract
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v - @i) with release ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_sub_return_release() there.
+ *
+ * Return: The updated value of @v.
+ */
static __always_inline long
atomic_long_sub_return_release(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_sub_return_release(i, v);
}

+ /**
+ * atomic_long_sub_return_relaxed() - atomic subtract with relaxed ordering
+ * @i: long value to subtract
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v - @i) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_sub_return_relaxed() there.
+ *
+ * Return: The updated value of @v.
+ */
static __always_inline long
atomic_long_sub_return_relaxed(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_sub_return_relaxed(i, v);
}

+ /**
+ * atomic_long_fetch_sub() - atomic subtract with full ordering
+ * @i: long value to subtract
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v - @i) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_sub() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_sub(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_sub(i, v);
}

+ /**
+ * atomic_long_fetch_sub_acquire() - atomic subtract with acquire ordering
+ * @i: long value to subtract
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v - @i) with acquire ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_sub_acquire() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_sub_acquire(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_sub_acquire(i, v);
}

+ /**
+ * atomic_long_fetch_sub_release() - atomic subtract with release ordering
+ * @i: long value to subtract
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v - @i) with release ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_sub_release() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_sub_release(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_sub_release(i, v);
}

+ /**
+ * atomic_long_fetch_sub_relaxed() - atomic subtract with relaxed ordering
+ * @i: long value to subtract
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v - @i) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_sub_relaxed() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_sub_relaxed(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_sub_relaxed(i, v);
}

+ /**
+ * atomic_long_inc() - atomic increment with relaxed ordering
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v + 1) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_inc() there.
+ *
+ * Return: Nothing.
+ */
static __always_inline void
atomic_long_inc(atomic_long_t *v)
{
	···
	raw_atomic_long_inc(v);
}

+ /**
+ * atomic_long_inc_return() - atomic increment with full ordering
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v + 1) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_inc_return() there.
+ *
+ * Return: The updated value of @v.
+ */
static __always_inline long
atomic_long_inc_return(atomic_long_t *v)
{
	···
	return raw_atomic_long_inc_return(v);
}

+ /**
+ * atomic_long_inc_return_acquire() - atomic increment with acquire ordering
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v + 1) with acquire ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_inc_return_acquire() there.
+ *
+ * Return: The updated value of @v.
+ */
static __always_inline long
atomic_long_inc_return_acquire(atomic_long_t *v)
{
	···
	return raw_atomic_long_inc_return_acquire(v);
}

+ /**
+ * atomic_long_inc_return_release() - atomic increment with release ordering
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v + 1) with release ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_inc_return_release() there.
+ *
+ * Return: The updated value of @v.
+ */
static __always_inline long
atomic_long_inc_return_release(atomic_long_t *v)
{
	···
	return raw_atomic_long_inc_return_release(v);
}

+ /**
+ * atomic_long_inc_return_relaxed() - atomic increment with relaxed ordering
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v + 1) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_inc_return_relaxed() there.
+ *
+ * Return: The updated value of @v.
+ */
static __always_inline long
atomic_long_inc_return_relaxed(atomic_long_t *v)
{
	···
	return raw_atomic_long_inc_return_relaxed(v);
}

+ /**
+ * atomic_long_fetch_inc() - atomic increment with full ordering
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v + 1) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_inc() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_inc(atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_inc(v);
}

+ /**
+ * atomic_long_fetch_inc_acquire() - atomic increment with acquire ordering
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v + 1) with acquire ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_inc_acquire() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_inc_acquire(atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_inc_acquire(v);
}

+ /**
+ * atomic_long_fetch_inc_release() - atomic increment with release ordering
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v + 1) with release ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_inc_release() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_inc_release(atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_inc_release(v);
}

+ /**
+ * atomic_long_fetch_inc_relaxed() - atomic increment with relaxed ordering
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v + 1) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_inc_relaxed() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_inc_relaxed(atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_inc_relaxed(v);
}

+ /**
+ * atomic_long_dec() - atomic decrement with relaxed ordering
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v - 1) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_dec() there.
+ *
+ * Return: Nothing.
+ */
static __always_inline void
atomic_long_dec(atomic_long_t *v)
{
	···
	raw_atomic_long_dec(v);
}

+ /**
+ * atomic_long_dec_return() - atomic decrement with full ordering
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v - 1) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_dec_return() there.
+ *
+ * Return: The updated value of @v.
+ */
static __always_inline long
atomic_long_dec_return(atomic_long_t *v)
{
	···
	return raw_atomic_long_dec_return(v);
}

+ /**
+ * atomic_long_dec_return_acquire() - atomic decrement with acquire ordering
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v - 1) with acquire ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_dec_return_acquire() there.
+ *
+ * Return: The updated value of @v.
+ */
static __always_inline long
atomic_long_dec_return_acquire(atomic_long_t *v)
{
	···
	return raw_atomic_long_dec_return_acquire(v);
}

+ /**
+ * atomic_long_dec_return_release() - atomic decrement with release ordering
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v - 1) with release ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_dec_return_release() there.
+ *
+ * Return: The updated value of @v.
+ */
static __always_inline long
atomic_long_dec_return_release(atomic_long_t *v)
{
	···
	return raw_atomic_long_dec_return_release(v);
}

+ /**
+ * atomic_long_dec_return_relaxed() - atomic decrement with relaxed ordering
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v - 1) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_dec_return_relaxed() there.
+ *
+ * Return: The updated value of @v.
+ */
static __always_inline long
atomic_long_dec_return_relaxed(atomic_long_t *v)
{
	···
	return raw_atomic_long_dec_return_relaxed(v);
}

+ /**
+ * atomic_long_fetch_dec() - atomic decrement with full ordering
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v - 1) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_dec() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_dec(atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_dec(v);
}

+ /**
+ * atomic_long_fetch_dec_acquire() - atomic decrement with acquire ordering
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v - 1) with acquire ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_dec_acquire() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_dec_acquire(atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_dec_acquire(v);
}

+ /**
+ * atomic_long_fetch_dec_release() - atomic decrement with release ordering
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v - 1) with release ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_dec_release() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_dec_release(atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_dec_release(v);
}

+ /**
+ * atomic_long_fetch_dec_relaxed() - atomic decrement with relaxed ordering
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v - 1) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_dec_relaxed() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_dec_relaxed(atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_dec_relaxed(v);
}

+ /**
+ * atomic_long_and() - atomic bitwise AND with relaxed ordering
+ * @i: long value
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v & @i) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_and() there.
+ *
+ * Return: Nothing.
+ */
static __always_inline void
atomic_long_and(long i, atomic_long_t *v)
{
	···
	raw_atomic_long_and(i, v);
}

+ /**
+ * atomic_long_fetch_and() - atomic bitwise AND with full ordering
+ * @i: long value
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v & @i) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_and() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_and(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_and(i, v);
}

+ /**
+ * atomic_long_fetch_and_acquire() - atomic bitwise AND with acquire ordering
+ * @i: long value
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v & @i) with acquire ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_and_acquire() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_and_acquire(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_and_acquire(i, v);
}

+ /**
+ * atomic_long_fetch_and_release() - atomic bitwise AND with release ordering
+ * @i: long value
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v & @i) with release ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_and_release() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_and_release(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_and_release(i, v);
}

+ /**
+ * atomic_long_fetch_and_relaxed() - atomic bitwise AND with relaxed ordering
+ * @i: long value
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v & @i) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_and_relaxed() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_and_relaxed(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_and_relaxed(i, v);
}

+ /**
+ * atomic_long_andnot() - atomic bitwise AND NOT with relaxed ordering
+ * @i: long value
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v & ~@i) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_andnot() there.
+ *
+ * Return: Nothing.
+ */
static __always_inline void
atomic_long_andnot(long i, atomic_long_t *v)
{
	···
	raw_atomic_long_andnot(i, v);
}

+ /**
+ * atomic_long_fetch_andnot() - atomic bitwise AND NOT with full ordering
+ * @i: long value
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v & ~@i) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_andnot() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_andnot(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_andnot(i, v);
}

+ /**
+ * atomic_long_fetch_andnot_acquire() - atomic bitwise AND NOT with acquire ordering
+ * @i: long value
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v & ~@i) with acquire ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_andnot_acquire() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_andnot_acquire(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_andnot_acquire(i, v);
}

+ /**
+ * atomic_long_fetch_andnot_release() - atomic bitwise AND NOT with release ordering
+ * @i: long value
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v & ~@i) with release ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_andnot_release() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_andnot_release(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_andnot_release(i, v);
}

+ /**
+ * atomic_long_fetch_andnot_relaxed() - atomic bitwise AND NOT with relaxed ordering
+ * @i: long value
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v & ~@i) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_andnot_relaxed() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_andnot_relaxed(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_andnot_relaxed(i, v);
}

+ /**
+ * atomic_long_or() - atomic bitwise OR with relaxed ordering
+ * @i: long value
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v | @i) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_or() there.
+ *
+ * Return: Nothing.
+ */
static __always_inline void
atomic_long_or(long i, atomic_long_t *v)
{
	···
	raw_atomic_long_or(i, v);
}

+ /**
+ * atomic_long_fetch_or() - atomic bitwise OR with full ordering
+ * @i: long value
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v | @i) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_or() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_or(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_or(i, v);
}

+ /**
+ * atomic_long_fetch_or_acquire() - atomic bitwise OR with acquire ordering
+ * @i: long value
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v | @i) with acquire ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_or_acquire() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_or_acquire(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_or_acquire(i, v);
}

+ /**
+ * atomic_long_fetch_or_release() - atomic bitwise OR with release ordering
+ * @i: long value
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v | @i) with release ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_or_release() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_or_release(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_or_release(i, v);
}

+ /**
+ * atomic_long_fetch_or_relaxed() - atomic bitwise OR with relaxed ordering
+ * @i: long value
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v | @i) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_or_relaxed() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_or_relaxed(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_or_relaxed(i, v);
}

+ /**
+ * atomic_long_xor() - atomic bitwise XOR with relaxed ordering
+ * @i: long value
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v ^ @i) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_xor() there.
+ *
+ * Return: Nothing.
+ */
static __always_inline void
atomic_long_xor(long i, atomic_long_t *v)
{
	···
	raw_atomic_long_xor(i, v);
}

+ /**
+ * atomic_long_fetch_xor() - atomic bitwise XOR with full ordering
+ * @i: long value
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v ^ @i) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_xor() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_xor(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_xor(i, v);
}

+ /**
+ * atomic_long_fetch_xor_acquire() - atomic bitwise XOR with acquire ordering
+ * @i: long value
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v ^ @i) with acquire ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_xor_acquire() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_xor_acquire(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_xor_acquire(i, v);
}

+ /**
+ * atomic_long_fetch_xor_release() - atomic bitwise XOR with release ordering
+ * @i: long value
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v ^ @i) with release ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_xor_release() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_xor_release(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_xor_release(i, v);
}

+ /**
+ * atomic_long_fetch_xor_relaxed() - atomic bitwise XOR with relaxed ordering
+ * @i: long value
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v ^ @i) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_xor_relaxed() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_xor_relaxed(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_fetch_xor_relaxed(i, v);
}

+ /**
+ * atomic_long_xchg() - atomic exchange with full ordering
+ * @v: pointer to atomic_long_t
+ * @new: long value to assign
+ *
+ * Atomically updates @v to @new with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_xchg() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_xchg(atomic_long_t *v, long new)
{
	···
	return raw_atomic_long_xchg(v, new);
}

+ /**
+ * atomic_long_xchg_acquire() - atomic exchange with acquire ordering
+ * @v: pointer to atomic_long_t
+ * @new: long value to assign
+ *
+ * Atomically updates @v to @new with acquire ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_xchg_acquire() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_xchg_acquire(atomic_long_t *v, long new)
{
	···
	return raw_atomic_long_xchg_acquire(v, new);
}

+ /**
+ * atomic_long_xchg_release() - atomic exchange with release ordering
+ * @v: pointer to atomic_long_t
+ * @new: long value to assign
+ *
+ * Atomically updates @v to @new with release ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_xchg_release() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_xchg_release(atomic_long_t *v, long new)
{
	···
	return raw_atomic_long_xchg_release(v, new);
}

+ /**
+ * atomic_long_xchg_relaxed() - atomic exchange with relaxed ordering
+ * @v: pointer to atomic_long_t
+ * @new: long value to assign
+ *
+ * Atomically updates @v to @new with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_xchg_relaxed() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_xchg_relaxed(atomic_long_t *v, long new)
{
	···
	return raw_atomic_long_xchg_relaxed(v, new);
}

+ /**
+ * atomic_long_cmpxchg() - atomic compare and exchange with full ordering
+ * @v: pointer to atomic_long_t
+ * @old: long value to compare with
+ * @new: long value to assign
+ *
+ * If (@v == @old), atomically updates @v to @new with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_cmpxchg() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_cmpxchg(atomic_long_t *v, long old, long new)
{
	···
	return raw_atomic_long_cmpxchg(v, old, new);
}

+ /**
+ * atomic_long_cmpxchg_acquire() - atomic compare and exchange with acquire ordering
+ * @v: pointer to atomic_long_t
+ * @old: long value to compare with
+ * @new: long value to assign
+ *
+ * If (@v == @old), atomically updates @v to @new with acquire ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_cmpxchg_acquire() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_cmpxchg_acquire(atomic_long_t *v, long old, long new)
{
	···
	return raw_atomic_long_cmpxchg_acquire(v, old, new);
}

+ /**
+ * atomic_long_cmpxchg_release() - atomic compare and exchange with release ordering
+ * @v: pointer to atomic_long_t
+ * @old: long value to compare with
+ * @new: long value to assign
+ *
+ * If (@v == @old), atomically updates @v to @new with release ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_cmpxchg_release() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_cmpxchg_release(atomic_long_t *v, long old, long new)
{
	···
	return raw_atomic_long_cmpxchg_release(v, old, new);
}

+ /**
+ * atomic_long_cmpxchg_relaxed() - atomic compare and exchange with relaxed ordering
+ * @v: pointer to atomic_long_t
+ * @old: long value to compare with
+ * @new: long value to assign
+ *
+ * If (@v == @old), atomically updates @v to @new with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_cmpxchg_relaxed() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_cmpxchg_relaxed(atomic_long_t *v, long old, long new)
{
	···
	return raw_atomic_long_cmpxchg_relaxed(v, old, new);
}

+ /**
+ * atomic_long_try_cmpxchg() - atomic compare and exchange with full ordering
+ * @v: pointer to atomic_long_t
+ * @old: pointer to long value to compare with
+ * @new: long value to assign
+ *
+ * If (@v == @old), atomically updates @v to @new with full ordering.
+ * Otherwise, updates @old to the current value of @v.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_try_cmpxchg() there.
+ *
+ * Return: @true if the exchange occurred, @false otherwise.
+ */
static __always_inline bool
atomic_long_try_cmpxchg(atomic_long_t *v, long *old, long new)
{
	···
	return raw_atomic_long_try_cmpxchg(v, old, new);
}

+ /**
+ * atomic_long_try_cmpxchg_acquire() - atomic compare and exchange with acquire ordering
+ * @v: pointer to atomic_long_t
+ * @old: pointer to long value to compare with
+ * @new: long value to assign
+ *
+ * If (@v == @old), atomically updates @v to @new with acquire ordering.
+ * Otherwise, updates @old to the current value of @v.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_try_cmpxchg_acquire() there.
+ *
+ * Return: @true if the exchange occurred, @false otherwise.
+ */
static __always_inline bool
atomic_long_try_cmpxchg_acquire(atomic_long_t *v, long *old, long new)
{
	···
	return raw_atomic_long_try_cmpxchg_acquire(v, old, new);
}

+ /**
+ * atomic_long_try_cmpxchg_release() - atomic compare and exchange with release ordering
+ * @v: pointer to atomic_long_t
+ * @old: pointer to long value to compare with
+ * @new: long value to assign
+ *
+ * If (@v == @old), atomically updates @v to @new with release ordering.
+ * Otherwise, updates @old to the current value of @v.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_try_cmpxchg_release() there.
+ *
+ * Return: @true if the exchange occurred, @false otherwise.
+ */
static __always_inline bool
atomic_long_try_cmpxchg_release(atomic_long_t *v, long *old, long new)
{
	···
	return raw_atomic_long_try_cmpxchg_release(v, old, new);
}

+ /**
+ * atomic_long_try_cmpxchg_relaxed() - atomic compare and exchange with relaxed ordering
+ * @v: pointer to atomic_long_t
+ * @old: pointer to long value to compare with
+ * @new: long value to assign
+ *
+ * If (@v == @old), atomically updates @v to @new with relaxed ordering.
+ * Otherwise, updates @old to the current value of @v.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_try_cmpxchg_relaxed() there.
+ *
+ * Return: @true if the exchange occurred, @false otherwise.
+ */
static __always_inline bool
atomic_long_try_cmpxchg_relaxed(atomic_long_t *v, long *old, long new)
{
	···
	return raw_atomic_long_try_cmpxchg_relaxed(v, old, new);
}

+ /**
+ * atomic_long_sub_and_test() - atomic subtract and test if zero with full ordering
+ * @i: long value to subtract
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v - @i) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_sub_and_test() there.
+ *
+ * Return: @true if the resulting value of @v is zero, @false otherwise.
+ */
static __always_inline bool
atomic_long_sub_and_test(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_sub_and_test(i, v);
}

+ /**
+ * atomic_long_dec_and_test() - atomic decrement and test if zero with full ordering
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v - 1) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_dec_and_test() there.
+ *
+ * Return: @true if the resulting value of @v is zero, @false otherwise.
+ */
static __always_inline bool
atomic_long_dec_and_test(atomic_long_t *v)
{
	···
	return raw_atomic_long_dec_and_test(v);
}

+ /**
+ * atomic_long_inc_and_test() - atomic increment and test if zero with full ordering
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v + 1) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_inc_and_test() there.
+ *
+ * Return: @true if the resulting value of @v is zero, @false otherwise.
+ */
static __always_inline bool
atomic_long_inc_and_test(atomic_long_t *v)
{
	···
	return raw_atomic_long_inc_and_test(v);
}

+ /**
+ * atomic_long_add_negative() - atomic add and test if negative with full ordering
+ * @i: long value to add
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v + @i) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_add_negative() there.
+ *
+ * Return: @true if the resulting value of @v is negative, @false otherwise.
+ */
static __always_inline bool
atomic_long_add_negative(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_add_negative(i, v);
}

+ /**
+ * atomic_long_add_negative_acquire() - atomic add and test if negative with acquire ordering
+ * @i: long value to add
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v + @i) with acquire ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_add_negative_acquire() there.
+ *
+ * Return: @true if the resulting value of @v is negative, @false otherwise.
+ */
static __always_inline bool
atomic_long_add_negative_acquire(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_add_negative_acquire(i, v);
}

+ /**
+ * atomic_long_add_negative_release() - atomic add and test if negative with release ordering
+ * @i: long value to add
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v + @i) with release ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_add_negative_release() there.
+ *
+ * Return: @true if the resulting value of @v is negative, @false otherwise.
+ */
static __always_inline bool
atomic_long_add_negative_release(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_add_negative_release(i, v);
}

+ /**
+ * atomic_long_add_negative_relaxed() - atomic add and test if negative with relaxed ordering
+ * @i: long value to add
+ * @v: pointer to atomic_long_t
+ *
+ * Atomically updates @v to (@v + @i) with relaxed ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_add_negative_relaxed() there.
+ *
+ * Return: @true if the resulting value of @v is negative, @false otherwise.
+ */
static __always_inline bool
atomic_long_add_negative_relaxed(long i, atomic_long_t *v)
{
	···
	return raw_atomic_long_add_negative_relaxed(i, v);
}

+ /**
+ * atomic_long_fetch_add_unless() - atomic add unless value with full ordering
+ * @v: pointer to atomic_long_t
+ * @a: long value to add
+ * @u: long value to compare with
+ *
+ * If (@v != @u), atomically updates @v to (@v + @a) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_fetch_add_unless() there.
+ *
+ * Return: The original value of @v.
+ */
static __always_inline long
atomic_long_fetch_add_unless(atomic_long_t *v, long a, long u)
{
	···
	return raw_atomic_long_fetch_add_unless(v, a, u);
}

+ /**
+ * atomic_long_add_unless() - atomic add unless value with full ordering
+ * @v: pointer to atomic_long_t
+ * @a: long value to add
+ * @u: long value to compare with
+ *
+ * If (@v != @u), atomically updates @v to (@v + @a) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_add_unless() there.
+ *
+ * Return: @true if @v was updated, @false otherwise.
+ */
static __always_inline bool
atomic_long_add_unless(atomic_long_t *v, long a, long u)
{
	···
	return raw_atomic_long_add_unless(v, a, u);
}

+ /**
+ * atomic_long_inc_not_zero() - atomic increment unless zero with full ordering
+ * @v: pointer to atomic_long_t
+ *
+ * If (@v != 0), atomically updates @v to (@v + 1) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_inc_not_zero() there.
+ *
+ * Return: @true if @v was updated, @false otherwise.
+ */
static __always_inline bool
atomic_long_inc_not_zero(atomic_long_t *v)
{
	···
	return raw_atomic_long_inc_not_zero(v);
}

+ /**
+ * atomic_long_inc_unless_negative() - atomic increment unless negative with full ordering
+ * @v: pointer to atomic_long_t
+ *
+ * If (@v >= 0), atomically updates @v to (@v + 1) with full ordering.
+ *
+ * Unsafe to use in noinstr code; use raw_atomic_long_inc_unless_negative() there.
+ *
+ * Return: @true if @v was updated, @false otherwise.
1927 + */ 4657 1928 static __always_inline bool 4658 1929 atomic_long_inc_unless_negative(atomic_long_t *v) 4659 1930 { ··· 4672 1923 return raw_atomic_long_inc_unless_negative(v); 4673 1924 } 4674 1925 1926 + /** 1927 + * atomic_long_dec_unless_positive() - atomic decrement unless positive with full ordering 1928 + * @v: pointer to atomic_long_t 1929 + * 1930 + * If (@v <= 0), atomically updates @v to (@v - 1) with full ordering. 1931 + * 1932 + * Unsafe to use in noinstr code; use raw_atomic_long_dec_unless_positive() there. 1933 + * 1934 + * Return: @true if @v was updated, @false otherwise. 1935 + */ 4675 1936 static __always_inline bool 4676 1937 atomic_long_dec_unless_positive(atomic_long_t *v) 4677 1938 { ··· 4690 1931 return raw_atomic_long_dec_unless_positive(v); 4691 1932 } 4692 1933 1934 + /** 1935 + * atomic_long_dec_if_positive() - atomic decrement if positive with full ordering 1936 + * @v: pointer to atomic_long_t 1937 + * 1938 + * If (@v > 0), atomically updates @v to (@v - 1) with full ordering. 1939 + * 1940 + * Unsafe to use in noinstr code; use raw_atomic_long_dec_if_positive() there. 1941 + * 1942 + * Return: The old value of (@v - 1), regardless of whether @v was updated. 1943 + */ 4693 1944 static __always_inline long 4694 1945 atomic_long_dec_if_positive(atomic_long_t *v) 4695 1946 { ··· 5000 2231 5001 2232 5002 2233 #endif /* _LINUX_ATOMIC_INSTRUMENTED_H */ 5003 - // a4c3d2b229f907654cc53cb5d40e80f7fed1ec9c 2234 + // 06cec02e676a484857aee38b0071a1d846ec9457
+924 -1
include/linux/atomic/atomic-long.h
··· 21 21 #define atomic_long_cond_read_relaxed atomic_cond_read_relaxed 22 22 #endif 23 23 24 + /** 25 + * raw_atomic_long_read() - atomic load with relaxed ordering 26 + * @v: pointer to atomic_long_t 27 + * 28 + * Atomically loads the value of @v with relaxed ordering. 29 + * 30 + * Safe to use in noinstr code; prefer atomic_long_read() elsewhere. 31 + * 32 + * Return: The value loaded from @v. 33 + */ 24 34 static __always_inline long 25 35 raw_atomic_long_read(const atomic_long_t *v) 26 36 { ··· 41 31 #endif 42 32 } 43 33 34 + /** 35 + * raw_atomic_long_read_acquire() - atomic load with acquire ordering 36 + * @v: pointer to atomic_long_t 37 + * 38 + * Atomically loads the value of @v with acquire ordering. 39 + * 40 + * Safe to use in noinstr code; prefer atomic_long_read_acquire() elsewhere. 41 + * 42 + * Return: The value loaded from @v. 43 + */ 44 44 static __always_inline long 45 45 raw_atomic_long_read_acquire(const atomic_long_t *v) 46 46 { ··· 61 41 #endif 62 42 } 63 43 44 + /** 45 + * raw_atomic_long_set() - atomic set with relaxed ordering 46 + * @v: pointer to atomic_long_t 47 + * @i: long value to assign 48 + * 49 + * Atomically sets @v to @i with relaxed ordering. 50 + * 51 + * Safe to use in noinstr code; prefer atomic_long_set() elsewhere. 52 + * 53 + * Return: Nothing. 54 + */ 64 55 static __always_inline void 65 56 raw_atomic_long_set(atomic_long_t *v, long i) 66 57 { ··· 82 51 #endif 83 52 } 84 53 54 + /** 55 + * raw_atomic_long_set_release() - atomic set with release ordering 56 + * @v: pointer to atomic_long_t 57 + * @i: long value to assign 58 + * 59 + * Atomically sets @v to @i with release ordering. 60 + * 61 + * Safe to use in noinstr code; prefer atomic_long_set_release() elsewhere. 62 + * 63 + * Return: Nothing. 
64 + */ 85 65 static __always_inline void 86 66 raw_atomic_long_set_release(atomic_long_t *v, long i) 87 67 { ··· 103 61 #endif 104 62 } 105 63 64 + /** 65 + * raw_atomic_long_add() - atomic add with relaxed ordering 66 + * @i: long value to add 67 + * @v: pointer to atomic_long_t 68 + * 69 + * Atomically updates @v to (@v + @i) with relaxed ordering. 70 + * 71 + * Safe to use in noinstr code; prefer atomic_long_add() elsewhere. 72 + * 73 + * Return: Nothing. 74 + */ 106 75 static __always_inline void 107 76 raw_atomic_long_add(long i, atomic_long_t *v) 108 77 { ··· 124 71 #endif 125 72 } 126 73 74 + /** 75 + * raw_atomic_long_add_return() - atomic add with full ordering 76 + * @i: long value to add 77 + * @v: pointer to atomic_long_t 78 + * 79 + * Atomically updates @v to (@v + @i) with full ordering. 80 + * 81 + * Safe to use in noinstr code; prefer atomic_long_add_return() elsewhere. 82 + * 83 + * Return: The updated value of @v. 84 + */ 127 85 static __always_inline long 128 86 raw_atomic_long_add_return(long i, atomic_long_t *v) 129 87 { ··· 145 81 #endif 146 82 } 147 83 84 + /** 85 + * raw_atomic_long_add_return_acquire() - atomic add with acquire ordering 86 + * @i: long value to add 87 + * @v: pointer to atomic_long_t 88 + * 89 + * Atomically updates @v to (@v + @i) with acquire ordering. 90 + * 91 + * Safe to use in noinstr code; prefer atomic_long_add_return_acquire() elsewhere. 92 + * 93 + * Return: The updated value of @v. 94 + */ 148 95 static __always_inline long 149 96 raw_atomic_long_add_return_acquire(long i, atomic_long_t *v) 150 97 { ··· 166 91 #endif 167 92 } 168 93 94 + /** 95 + * raw_atomic_long_add_return_release() - atomic add with release ordering 96 + * @i: long value to add 97 + * @v: pointer to atomic_long_t 98 + * 99 + * Atomically updates @v to (@v + @i) with release ordering. 100 + * 101 + * Safe to use in noinstr code; prefer atomic_long_add_return_release() elsewhere. 102 + * 103 + * Return: The updated value of @v. 
104 + */ 169 105 static __always_inline long 170 106 raw_atomic_long_add_return_release(long i, atomic_long_t *v) 171 107 { ··· 187 101 #endif 188 102 } 189 103 104 + /** 105 + * raw_atomic_long_add_return_relaxed() - atomic add with relaxed ordering 106 + * @i: long value to add 107 + * @v: pointer to atomic_long_t 108 + * 109 + * Atomically updates @v to (@v + @i) with relaxed ordering. 110 + * 111 + * Safe to use in noinstr code; prefer atomic_long_add_return_relaxed() elsewhere. 112 + * 113 + * Return: The updated value of @v. 114 + */ 190 115 static __always_inline long 191 116 raw_atomic_long_add_return_relaxed(long i, atomic_long_t *v) 192 117 { ··· 208 111 #endif 209 112 } 210 113 114 + /** 115 + * raw_atomic_long_fetch_add() - atomic add with full ordering 116 + * @i: long value to add 117 + * @v: pointer to atomic_long_t 118 + * 119 + * Atomically updates @v to (@v + @i) with full ordering. 120 + * 121 + * Safe to use in noinstr code; prefer atomic_long_fetch_add() elsewhere. 122 + * 123 + * Return: The original value of @v. 124 + */ 211 125 static __always_inline long 212 126 raw_atomic_long_fetch_add(long i, atomic_long_t *v) 213 127 { ··· 229 121 #endif 230 122 } 231 123 124 + /** 125 + * raw_atomic_long_fetch_add_acquire() - atomic add with acquire ordering 126 + * @i: long value to add 127 + * @v: pointer to atomic_long_t 128 + * 129 + * Atomically updates @v to (@v + @i) with acquire ordering. 130 + * 131 + * Safe to use in noinstr code; prefer atomic_long_fetch_add_acquire() elsewhere. 132 + * 133 + * Return: The original value of @v. 134 + */ 232 135 static __always_inline long 233 136 raw_atomic_long_fetch_add_acquire(long i, atomic_long_t *v) 234 137 { ··· 250 131 #endif 251 132 } 252 133 134 + /** 135 + * raw_atomic_long_fetch_add_release() - atomic add with release ordering 136 + * @i: long value to add 137 + * @v: pointer to atomic_long_t 138 + * 139 + * Atomically updates @v to (@v + @i) with release ordering. 
140 + * 141 + * Safe to use in noinstr code; prefer atomic_long_fetch_add_release() elsewhere. 142 + * 143 + * Return: The original value of @v. 144 + */ 253 145 static __always_inline long 254 146 raw_atomic_long_fetch_add_release(long i, atomic_long_t *v) 255 147 { ··· 271 141 #endif 272 142 } 273 143 144 + /** 145 + * raw_atomic_long_fetch_add_relaxed() - atomic add with relaxed ordering 146 + * @i: long value to add 147 + * @v: pointer to atomic_long_t 148 + * 149 + * Atomically updates @v to (@v + @i) with relaxed ordering. 150 + * 151 + * Safe to use in noinstr code; prefer atomic_long_fetch_add_relaxed() elsewhere. 152 + * 153 + * Return: The original value of @v. 154 + */ 274 155 static __always_inline long 275 156 raw_atomic_long_fetch_add_relaxed(long i, atomic_long_t *v) 276 157 { ··· 292 151 #endif 293 152 } 294 153 154 + /** 155 + * raw_atomic_long_sub() - atomic subtract with relaxed ordering 156 + * @i: long value to subtract 157 + * @v: pointer to atomic_long_t 158 + * 159 + * Atomically updates @v to (@v - @i) with relaxed ordering. 160 + * 161 + * Safe to use in noinstr code; prefer atomic_long_sub() elsewhere. 162 + * 163 + * Return: Nothing. 164 + */ 295 165 static __always_inline void 296 166 raw_atomic_long_sub(long i, atomic_long_t *v) 297 167 { ··· 313 161 #endif 314 162 } 315 163 164 + /** 165 + * raw_atomic_long_sub_return() - atomic subtract with full ordering 166 + * @i: long value to subtract 167 + * @v: pointer to atomic_long_t 168 + * 169 + * Atomically updates @v to (@v - @i) with full ordering. 170 + * 171 + * Safe to use in noinstr code; prefer atomic_long_sub_return() elsewhere. 172 + * 173 + * Return: The updated value of @v. 
174 + */ 316 175 static __always_inline long 317 176 raw_atomic_long_sub_return(long i, atomic_long_t *v) 318 177 { ··· 334 171 #endif 335 172 } 336 173 174 + /** 175 + * raw_atomic_long_sub_return_acquire() - atomic subtract with acquire ordering 176 + * @i: long value to subtract 177 + * @v: pointer to atomic_long_t 178 + * 179 + * Atomically updates @v to (@v - @i) with acquire ordering. 180 + * 181 + * Safe to use in noinstr code; prefer atomic_long_sub_return_acquire() elsewhere. 182 + * 183 + * Return: The updated value of @v. 184 + */ 337 185 static __always_inline long 338 186 raw_atomic_long_sub_return_acquire(long i, atomic_long_t *v) 339 187 { ··· 355 181 #endif 356 182 } 357 183 184 + /** 185 + * raw_atomic_long_sub_return_release() - atomic subtract with release ordering 186 + * @i: long value to subtract 187 + * @v: pointer to atomic_long_t 188 + * 189 + * Atomically updates @v to (@v - @i) with release ordering. 190 + * 191 + * Safe to use in noinstr code; prefer atomic_long_sub_return_release() elsewhere. 192 + * 193 + * Return: The updated value of @v. 194 + */ 358 195 static __always_inline long 359 196 raw_atomic_long_sub_return_release(long i, atomic_long_t *v) 360 197 { ··· 376 191 #endif 377 192 } 378 193 194 + /** 195 + * raw_atomic_long_sub_return_relaxed() - atomic subtract with relaxed ordering 196 + * @i: long value to subtract 197 + * @v: pointer to atomic_long_t 198 + * 199 + * Atomically updates @v to (@v - @i) with relaxed ordering. 200 + * 201 + * Safe to use in noinstr code; prefer atomic_long_sub_return_relaxed() elsewhere. 202 + * 203 + * Return: The updated value of @v. 
204 + */ 379 205 static __always_inline long 380 206 raw_atomic_long_sub_return_relaxed(long i, atomic_long_t *v) 381 207 { ··· 397 201 #endif 398 202 } 399 203 204 + /** 205 + * raw_atomic_long_fetch_sub() - atomic subtract with full ordering 206 + * @i: long value to subtract 207 + * @v: pointer to atomic_long_t 208 + * 209 + * Atomically updates @v to (@v - @i) with full ordering. 210 + * 211 + * Safe to use in noinstr code; prefer atomic_long_fetch_sub() elsewhere. 212 + * 213 + * Return: The original value of @v. 214 + */ 400 215 static __always_inline long 401 216 raw_atomic_long_fetch_sub(long i, atomic_long_t *v) 402 217 { ··· 418 211 #endif 419 212 } 420 213 214 + /** 215 + * raw_atomic_long_fetch_sub_acquire() - atomic subtract with acquire ordering 216 + * @i: long value to subtract 217 + * @v: pointer to atomic_long_t 218 + * 219 + * Atomically updates @v to (@v - @i) with acquire ordering. 220 + * 221 + * Safe to use in noinstr code; prefer atomic_long_fetch_sub_acquire() elsewhere. 222 + * 223 + * Return: The original value of @v. 224 + */ 421 225 static __always_inline long 422 226 raw_atomic_long_fetch_sub_acquire(long i, atomic_long_t *v) 423 227 { ··· 439 221 #endif 440 222 } 441 223 224 + /** 225 + * raw_atomic_long_fetch_sub_release() - atomic subtract with release ordering 226 + * @i: long value to subtract 227 + * @v: pointer to atomic_long_t 228 + * 229 + * Atomically updates @v to (@v - @i) with release ordering. 230 + * 231 + * Safe to use in noinstr code; prefer atomic_long_fetch_sub_release() elsewhere. 232 + * 233 + * Return: The original value of @v. 
234 + */ 442 235 static __always_inline long 443 236 raw_atomic_long_fetch_sub_release(long i, atomic_long_t *v) 444 237 { ··· 460 231 #endif 461 232 } 462 233 234 + /** 235 + * raw_atomic_long_fetch_sub_relaxed() - atomic subtract with relaxed ordering 236 + * @i: long value to subtract 237 + * @v: pointer to atomic_long_t 238 + * 239 + * Atomically updates @v to (@v - @i) with relaxed ordering. 240 + * 241 + * Safe to use in noinstr code; prefer atomic_long_fetch_sub_relaxed() elsewhere. 242 + * 243 + * Return: The original value of @v. 244 + */ 463 245 static __always_inline long 464 246 raw_atomic_long_fetch_sub_relaxed(long i, atomic_long_t *v) 465 247 { ··· 481 241 #endif 482 242 } 483 243 244 + /** 245 + * raw_atomic_long_inc() - atomic increment with relaxed ordering 246 + * @v: pointer to atomic_long_t 247 + * 248 + * Atomically updates @v to (@v + 1) with relaxed ordering. 249 + * 250 + * Safe to use in noinstr code; prefer atomic_long_inc() elsewhere. 251 + * 252 + * Return: Nothing. 253 + */ 484 254 static __always_inline void 485 255 raw_atomic_long_inc(atomic_long_t *v) 486 256 { ··· 501 251 #endif 502 252 } 503 253 254 + /** 255 + * raw_atomic_long_inc_return() - atomic increment with full ordering 256 + * @v: pointer to atomic_long_t 257 + * 258 + * Atomically updates @v to (@v + 1) with full ordering. 259 + * 260 + * Safe to use in noinstr code; prefer atomic_long_inc_return() elsewhere. 261 + * 262 + * Return: The updated value of @v. 263 + */ 504 264 static __always_inline long 505 265 raw_atomic_long_inc_return(atomic_long_t *v) 506 266 { ··· 521 261 #endif 522 262 } 523 263 264 + /** 265 + * raw_atomic_long_inc_return_acquire() - atomic increment with acquire ordering 266 + * @v: pointer to atomic_long_t 267 + * 268 + * Atomically updates @v to (@v + 1) with acquire ordering. 269 + * 270 + * Safe to use in noinstr code; prefer atomic_long_inc_return_acquire() elsewhere. 271 + * 272 + * Return: The updated value of @v. 
273 + */ 524 274 static __always_inline long 525 275 raw_atomic_long_inc_return_acquire(atomic_long_t *v) 526 276 { ··· 541 271 #endif 542 272 } 543 273 274 + /** 275 + * raw_atomic_long_inc_return_release() - atomic increment with release ordering 276 + * @v: pointer to atomic_long_t 277 + * 278 + * Atomically updates @v to (@v + 1) with release ordering. 279 + * 280 + * Safe to use in noinstr code; prefer atomic_long_inc_return_release() elsewhere. 281 + * 282 + * Return: The updated value of @v. 283 + */ 544 284 static __always_inline long 545 285 raw_atomic_long_inc_return_release(atomic_long_t *v) 546 286 { ··· 561 281 #endif 562 282 } 563 283 284 + /** 285 + * raw_atomic_long_inc_return_relaxed() - atomic increment with relaxed ordering 286 + * @v: pointer to atomic_long_t 287 + * 288 + * Atomically updates @v to (@v + 1) with relaxed ordering. 289 + * 290 + * Safe to use in noinstr code; prefer atomic_long_inc_return_relaxed() elsewhere. 291 + * 292 + * Return: The updated value of @v. 293 + */ 564 294 static __always_inline long 565 295 raw_atomic_long_inc_return_relaxed(atomic_long_t *v) 566 296 { ··· 581 291 #endif 582 292 } 583 293 294 + /** 295 + * raw_atomic_long_fetch_inc() - atomic increment with full ordering 296 + * @v: pointer to atomic_long_t 297 + * 298 + * Atomically updates @v to (@v + 1) with full ordering. 299 + * 300 + * Safe to use in noinstr code; prefer atomic_long_fetch_inc() elsewhere. 301 + * 302 + * Return: The original value of @v. 303 + */ 584 304 static __always_inline long 585 305 raw_atomic_long_fetch_inc(atomic_long_t *v) 586 306 { ··· 601 301 #endif 602 302 } 603 303 304 + /** 305 + * raw_atomic_long_fetch_inc_acquire() - atomic increment with acquire ordering 306 + * @v: pointer to atomic_long_t 307 + * 308 + * Atomically updates @v to (@v + 1) with acquire ordering. 309 + * 310 + * Safe to use in noinstr code; prefer atomic_long_fetch_inc_acquire() elsewhere. 311 + * 312 + * Return: The original value of @v. 
313 + */ 604 314 static __always_inline long 605 315 raw_atomic_long_fetch_inc_acquire(atomic_long_t *v) 606 316 { ··· 621 311 #endif 622 312 } 623 313 314 + /** 315 + * raw_atomic_long_fetch_inc_release() - atomic increment with release ordering 316 + * @v: pointer to atomic_long_t 317 + * 318 + * Atomically updates @v to (@v + 1) with release ordering. 319 + * 320 + * Safe to use in noinstr code; prefer atomic_long_fetch_inc_release() elsewhere. 321 + * 322 + * Return: The original value of @v. 323 + */ 624 324 static __always_inline long 625 325 raw_atomic_long_fetch_inc_release(atomic_long_t *v) 626 326 { ··· 641 321 #endif 642 322 } 643 323 324 + /** 325 + * raw_atomic_long_fetch_inc_relaxed() - atomic increment with relaxed ordering 326 + * @v: pointer to atomic_long_t 327 + * 328 + * Atomically updates @v to (@v + 1) with relaxed ordering. 329 + * 330 + * Safe to use in noinstr code; prefer atomic_long_fetch_inc_relaxed() elsewhere. 331 + * 332 + * Return: The original value of @v. 333 + */ 644 334 static __always_inline long 645 335 raw_atomic_long_fetch_inc_relaxed(atomic_long_t *v) 646 336 { ··· 661 331 #endif 662 332 } 663 333 334 + /** 335 + * raw_atomic_long_dec() - atomic decrement with relaxed ordering 336 + * @v: pointer to atomic_long_t 337 + * 338 + * Atomically updates @v to (@v - 1) with relaxed ordering. 339 + * 340 + * Safe to use in noinstr code; prefer atomic_long_dec() elsewhere. 341 + * 342 + * Return: Nothing. 343 + */ 664 344 static __always_inline void 665 345 raw_atomic_long_dec(atomic_long_t *v) 666 346 { ··· 681 341 #endif 682 342 } 683 343 344 + /** 345 + * raw_atomic_long_dec_return() - atomic decrement with full ordering 346 + * @v: pointer to atomic_long_t 347 + * 348 + * Atomically updates @v to (@v - 1) with full ordering. 349 + * 350 + * Safe to use in noinstr code; prefer atomic_long_dec_return() elsewhere. 351 + * 352 + * Return: The updated value of @v. 
353 + */ 684 354 static __always_inline long 685 355 raw_atomic_long_dec_return(atomic_long_t *v) 686 356 { ··· 701 351 #endif 702 352 } 703 353 354 + /** 355 + * raw_atomic_long_dec_return_acquire() - atomic decrement with acquire ordering 356 + * @v: pointer to atomic_long_t 357 + * 358 + * Atomically updates @v to (@v - 1) with acquire ordering. 359 + * 360 + * Safe to use in noinstr code; prefer atomic_long_dec_return_acquire() elsewhere. 361 + * 362 + * Return: The updated value of @v. 363 + */ 704 364 static __always_inline long 705 365 raw_atomic_long_dec_return_acquire(atomic_long_t *v) 706 366 { ··· 721 361 #endif 722 362 } 723 363 364 + /** 365 + * raw_atomic_long_dec_return_release() - atomic decrement with release ordering 366 + * @v: pointer to atomic_long_t 367 + * 368 + * Atomically updates @v to (@v - 1) with release ordering. 369 + * 370 + * Safe to use in noinstr code; prefer atomic_long_dec_return_release() elsewhere. 371 + * 372 + * Return: The updated value of @v. 373 + */ 724 374 static __always_inline long 725 375 raw_atomic_long_dec_return_release(atomic_long_t *v) 726 376 { ··· 741 371 #endif 742 372 } 743 373 374 + /** 375 + * raw_atomic_long_dec_return_relaxed() - atomic decrement with relaxed ordering 376 + * @v: pointer to atomic_long_t 377 + * 378 + * Atomically updates @v to (@v - 1) with relaxed ordering. 379 + * 380 + * Safe to use in noinstr code; prefer atomic_long_dec_return_relaxed() elsewhere. 381 + * 382 + * Return: The updated value of @v. 383 + */ 744 384 static __always_inline long 745 385 raw_atomic_long_dec_return_relaxed(atomic_long_t *v) 746 386 { ··· 761 381 #endif 762 382 } 763 383 384 + /** 385 + * raw_atomic_long_fetch_dec() - atomic decrement with full ordering 386 + * @v: pointer to atomic_long_t 387 + * 388 + * Atomically updates @v to (@v - 1) with full ordering. 389 + * 390 + * Safe to use in noinstr code; prefer atomic_long_fetch_dec() elsewhere. 391 + * 392 + * Return: The original value of @v. 
393 + */ 764 394 static __always_inline long 765 395 raw_atomic_long_fetch_dec(atomic_long_t *v) 766 396 { ··· 781 391 #endif 782 392 } 783 393 394 + /** 395 + * raw_atomic_long_fetch_dec_acquire() - atomic decrement with acquire ordering 396 + * @v: pointer to atomic_long_t 397 + * 398 + * Atomically updates @v to (@v - 1) with acquire ordering. 399 + * 400 + * Safe to use in noinstr code; prefer atomic_long_fetch_dec_acquire() elsewhere. 401 + * 402 + * Return: The original value of @v. 403 + */ 784 404 static __always_inline long 785 405 raw_atomic_long_fetch_dec_acquire(atomic_long_t *v) 786 406 { ··· 801 401 #endif 802 402 } 803 403 404 + /** 405 + * raw_atomic_long_fetch_dec_release() - atomic decrement with release ordering 406 + * @v: pointer to atomic_long_t 407 + * 408 + * Atomically updates @v to (@v - 1) with release ordering. 409 + * 410 + * Safe to use in noinstr code; prefer atomic_long_fetch_dec_release() elsewhere. 411 + * 412 + * Return: The original value of @v. 413 + */ 804 414 static __always_inline long 805 415 raw_atomic_long_fetch_dec_release(atomic_long_t *v) 806 416 { ··· 821 411 #endif 822 412 } 823 413 414 + /** 415 + * raw_atomic_long_fetch_dec_relaxed() - atomic decrement with relaxed ordering 416 + * @v: pointer to atomic_long_t 417 + * 418 + * Atomically updates @v to (@v - 1) with relaxed ordering. 419 + * 420 + * Safe to use in noinstr code; prefer atomic_long_fetch_dec_relaxed() elsewhere. 421 + * 422 + * Return: The original value of @v. 423 + */ 824 424 static __always_inline long 825 425 raw_atomic_long_fetch_dec_relaxed(atomic_long_t *v) 826 426 { ··· 841 421 #endif 842 422 } 843 423 424 + /** 425 + * raw_atomic_long_and() - atomic bitwise AND with relaxed ordering 426 + * @i: long value 427 + * @v: pointer to atomic_long_t 428 + * 429 + * Atomically updates @v to (@v & @i) with relaxed ordering. 430 + * 431 + * Safe to use in noinstr code; prefer atomic_long_and() elsewhere. 432 + * 433 + * Return: Nothing. 
434 + */ 844 435 static __always_inline void 845 436 raw_atomic_long_and(long i, atomic_long_t *v) 846 437 { ··· 862 431 #endif 863 432 } 864 433 434 + /** 435 + * raw_atomic_long_fetch_and() - atomic bitwise AND with full ordering 436 + * @i: long value 437 + * @v: pointer to atomic_long_t 438 + * 439 + * Atomically updates @v to (@v & @i) with full ordering. 440 + * 441 + * Safe to use in noinstr code; prefer atomic_long_fetch_and() elsewhere. 442 + * 443 + * Return: The original value of @v. 444 + */ 865 445 static __always_inline long 866 446 raw_atomic_long_fetch_and(long i, atomic_long_t *v) 867 447 { ··· 883 441 #endif 884 442 } 885 443 444 + /** 445 + * raw_atomic_long_fetch_and_acquire() - atomic bitwise AND with acquire ordering 446 + * @i: long value 447 + * @v: pointer to atomic_long_t 448 + * 449 + * Atomically updates @v to (@v & @i) with acquire ordering. 450 + * 451 + * Safe to use in noinstr code; prefer atomic_long_fetch_and_acquire() elsewhere. 452 + * 453 + * Return: The original value of @v. 454 + */ 886 455 static __always_inline long 887 456 raw_atomic_long_fetch_and_acquire(long i, atomic_long_t *v) 888 457 { ··· 904 451 #endif 905 452 } 906 453 454 + /** 455 + * raw_atomic_long_fetch_and_release() - atomic bitwise AND with release ordering 456 + * @i: long value 457 + * @v: pointer to atomic_long_t 458 + * 459 + * Atomically updates @v to (@v & @i) with release ordering. 460 + * 461 + * Safe to use in noinstr code; prefer atomic_long_fetch_and_release() elsewhere. 462 + * 463 + * Return: The original value of @v. 464 + */ 907 465 static __always_inline long 908 466 raw_atomic_long_fetch_and_release(long i, atomic_long_t *v) 909 467 { ··· 925 461 #endif 926 462 } 927 463 464 + /** 465 + * raw_atomic_long_fetch_and_relaxed() - atomic bitwise AND with relaxed ordering 466 + * @i: long value 467 + * @v: pointer to atomic_long_t 468 + * 469 + * Atomically updates @v to (@v & @i) with relaxed ordering. 
470 + * 471 + * Safe to use in noinstr code; prefer atomic_long_fetch_and_relaxed() elsewhere. 472 + * 473 + * Return: The original value of @v. 474 + */ 928 475 static __always_inline long 929 476 raw_atomic_long_fetch_and_relaxed(long i, atomic_long_t *v) 930 477 { ··· 946 471 #endif 947 472 } 948 473 474 + /** 475 + * raw_atomic_long_andnot() - atomic bitwise AND NOT with relaxed ordering 476 + * @i: long value 477 + * @v: pointer to atomic_long_t 478 + * 479 + * Atomically updates @v to (@v & ~@i) with relaxed ordering. 480 + * 481 + * Safe to use in noinstr code; prefer atomic_long_andnot() elsewhere. 482 + * 483 + * Return: Nothing. 484 + */ 949 485 static __always_inline void 950 486 raw_atomic_long_andnot(long i, atomic_long_t *v) 951 487 { ··· 967 481 #endif 968 482 } 969 483 484 + /** 485 + * raw_atomic_long_fetch_andnot() - atomic bitwise AND NOT with full ordering 486 + * @i: long value 487 + * @v: pointer to atomic_long_t 488 + * 489 + * Atomically updates @v to (@v & ~@i) with full ordering. 490 + * 491 + * Safe to use in noinstr code; prefer atomic_long_fetch_andnot() elsewhere. 492 + * 493 + * Return: The original value of @v. 494 + */ 970 495 static __always_inline long 971 496 raw_atomic_long_fetch_andnot(long i, atomic_long_t *v) 972 497 { ··· 988 491 #endif 989 492 } 990 493 494 + /** 495 + * raw_atomic_long_fetch_andnot_acquire() - atomic bitwise AND NOT with acquire ordering 496 + * @i: long value 497 + * @v: pointer to atomic_long_t 498 + * 499 + * Atomically updates @v to (@v & ~@i) with acquire ordering. 500 + * 501 + * Safe to use in noinstr code; prefer atomic_long_fetch_andnot_acquire() elsewhere. 502 + * 503 + * Return: The original value of @v. 
504 + */ 991 505 static __always_inline long 992 506 raw_atomic_long_fetch_andnot_acquire(long i, atomic_long_t *v) 993 507 { ··· 1009 501 #endif 1010 502 } 1011 503 504 + /** 505 + * raw_atomic_long_fetch_andnot_release() - atomic bitwise AND NOT with release ordering 506 + * @i: long value 507 + * @v: pointer to atomic_long_t 508 + * 509 + * Atomically updates @v to (@v & ~@i) with release ordering. 510 + * 511 + * Safe to use in noinstr code; prefer atomic_long_fetch_andnot_release() elsewhere. 512 + * 513 + * Return: The original value of @v. 514 + */ 1012 515 static __always_inline long 1013 516 raw_atomic_long_fetch_andnot_release(long i, atomic_long_t *v) 1014 517 { ··· 1030 511 #endif 1031 512 } 1032 513 514 + /** 515 + * raw_atomic_long_fetch_andnot_relaxed() - atomic bitwise AND NOT with relaxed ordering 516 + * @i: long value 517 + * @v: pointer to atomic_long_t 518 + * 519 + * Atomically updates @v to (@v & ~@i) with relaxed ordering. 520 + * 521 + * Safe to use in noinstr code; prefer atomic_long_fetch_andnot_relaxed() elsewhere. 522 + * 523 + * Return: The original value of @v. 524 + */ 1033 525 static __always_inline long 1034 526 raw_atomic_long_fetch_andnot_relaxed(long i, atomic_long_t *v) 1035 527 { ··· 1051 521 #endif 1052 522 } 1053 523 524 + /** 525 + * raw_atomic_long_or() - atomic bitwise OR with relaxed ordering 526 + * @i: long value 527 + * @v: pointer to atomic_long_t 528 + * 529 + * Atomically updates @v to (@v | @i) with relaxed ordering. 530 + * 531 + * Safe to use in noinstr code; prefer atomic_long_or() elsewhere. 532 + * 533 + * Return: Nothing. 534 + */ 1054 535 static __always_inline void 1055 536 raw_atomic_long_or(long i, atomic_long_t *v) 1056 537 { ··· 1072 531 #endif 1073 532 } 1074 533 534 + /** 535 + * raw_atomic_long_fetch_or() - atomic bitwise OR with full ordering 536 + * @i: long value 537 + * @v: pointer to atomic_long_t 538 + * 539 + * Atomically updates @v to (@v | @i) with full ordering. 
540 + * 541 + * Safe to use in noinstr code; prefer atomic_long_fetch_or() elsewhere. 542 + * 543 + * Return: The original value of @v. 544 + */ 1075 545 static __always_inline long 1076 546 raw_atomic_long_fetch_or(long i, atomic_long_t *v) 1077 547 { ··· 1093 541 #endif 1094 542 } 1095 543 544 + /** 545 + * raw_atomic_long_fetch_or_acquire() - atomic bitwise OR with acquire ordering 546 + * @i: long value 547 + * @v: pointer to atomic_long_t 548 + * 549 + * Atomically updates @v to (@v | @i) with acquire ordering. 550 + * 551 + * Safe to use in noinstr code; prefer atomic_long_fetch_or_acquire() elsewhere. 552 + * 553 + * Return: The original value of @v. 554 + */ 1096 555 static __always_inline long 1097 556 raw_atomic_long_fetch_or_acquire(long i, atomic_long_t *v) 1098 557 { ··· 1114 551 #endif 1115 552 } 1116 553 554 + /** 555 + * raw_atomic_long_fetch_or_release() - atomic bitwise OR with release ordering 556 + * @i: long value 557 + * @v: pointer to atomic_long_t 558 + * 559 + * Atomically updates @v to (@v | @i) with release ordering. 560 + * 561 + * Safe to use in noinstr code; prefer atomic_long_fetch_or_release() elsewhere. 562 + * 563 + * Return: The original value of @v. 564 + */ 1117 565 static __always_inline long 1118 566 raw_atomic_long_fetch_or_release(long i, atomic_long_t *v) 1119 567 { ··· 1135 561 #endif 1136 562 } 1137 563 564 + /** 565 + * raw_atomic_long_fetch_or_relaxed() - atomic bitwise OR with relaxed ordering 566 + * @i: long value 567 + * @v: pointer to atomic_long_t 568 + * 569 + * Atomically updates @v to (@v | @i) with relaxed ordering. 570 + * 571 + * Safe to use in noinstr code; prefer atomic_long_fetch_or_relaxed() elsewhere. 572 + * 573 + * Return: The original value of @v. 
574 + */ 1138 575 static __always_inline long 1139 576 raw_atomic_long_fetch_or_relaxed(long i, atomic_long_t *v) 1140 577 { ··· 1156 571 #endif 1157 572 } 1158 573 574 + /** 575 + * raw_atomic_long_xor() - atomic bitwise XOR with relaxed ordering 576 + * @i: long value 577 + * @v: pointer to atomic_long_t 578 + * 579 + * Atomically updates @v to (@v ^ @i) with relaxed ordering. 580 + * 581 + * Safe to use in noinstr code; prefer atomic_long_xor() elsewhere. 582 + * 583 + * Return: Nothing. 584 + */ 1159 585 static __always_inline void 1160 586 raw_atomic_long_xor(long i, atomic_long_t *v) 1161 587 { ··· 1177 581 #endif 1178 582 } 1179 583 584 + /** 585 + * raw_atomic_long_fetch_xor() - atomic bitwise XOR with full ordering 586 + * @i: long value 587 + * @v: pointer to atomic_long_t 588 + * 589 + * Atomically updates @v to (@v ^ @i) with full ordering. 590 + * 591 + * Safe to use in noinstr code; prefer atomic_long_fetch_xor() elsewhere. 592 + * 593 + * Return: The original value of @v. 594 + */ 1180 595 static __always_inline long 1181 596 raw_atomic_long_fetch_xor(long i, atomic_long_t *v) 1182 597 { ··· 1198 591 #endif 1199 592 } 1200 593 594 + /** 595 + * raw_atomic_long_fetch_xor_acquire() - atomic bitwise XOR with acquire ordering 596 + * @i: long value 597 + * @v: pointer to atomic_long_t 598 + * 599 + * Atomically updates @v to (@v ^ @i) with acquire ordering. 600 + * 601 + * Safe to use in noinstr code; prefer atomic_long_fetch_xor_acquire() elsewhere. 602 + * 603 + * Return: The original value of @v. 604 + */ 1201 605 static __always_inline long 1202 606 raw_atomic_long_fetch_xor_acquire(long i, atomic_long_t *v) 1203 607 { ··· 1219 601 #endif 1220 602 } 1221 603 604 + /** 605 + * raw_atomic_long_fetch_xor_release() - atomic bitwise XOR with release ordering 606 + * @i: long value 607 + * @v: pointer to atomic_long_t 608 + * 609 + * Atomically updates @v to (@v ^ @i) with release ordering. 
610 + * 611 + * Safe to use in noinstr code; prefer atomic_long_fetch_xor_release() elsewhere. 612 + * 613 + * Return: The original value of @v. 614 + */ 1222 615 static __always_inline long 1223 616 raw_atomic_long_fetch_xor_release(long i, atomic_long_t *v) 1224 617 { ··· 1240 611 #endif 1241 612 } 1242 613 614 + /** 615 + * raw_atomic_long_fetch_xor_relaxed() - atomic bitwise XOR with relaxed ordering 616 + * @i: long value 617 + * @v: pointer to atomic_long_t 618 + * 619 + * Atomically updates @v to (@v ^ @i) with relaxed ordering. 620 + * 621 + * Safe to use in noinstr code; prefer atomic_long_fetch_xor_relaxed() elsewhere. 622 + * 623 + * Return: The original value of @v. 624 + */ 1243 625 static __always_inline long 1244 626 raw_atomic_long_fetch_xor_relaxed(long i, atomic_long_t *v) 1245 627 { ··· 1261 621 #endif 1262 622 } 1263 623 624 + /** 625 + * raw_atomic_long_xchg() - atomic exchange with full ordering 626 + * @v: pointer to atomic_long_t 627 + * @new: long value to assign 628 + * 629 + * Atomically updates @v to @new with full ordering. 630 + * 631 + * Safe to use in noinstr code; prefer atomic_long_xchg() elsewhere. 632 + * 633 + * Return: The original value of @v. 634 + */ 1264 635 static __always_inline long 1265 636 raw_atomic_long_xchg(atomic_long_t *v, long new) 1266 637 { ··· 1282 631 #endif 1283 632 } 1284 633 634 + /** 635 + * raw_atomic_long_xchg_acquire() - atomic exchange with acquire ordering 636 + * @v: pointer to atomic_long_t 637 + * @new: long value to assign 638 + * 639 + * Atomically updates @v to @new with acquire ordering. 640 + * 641 + * Safe to use in noinstr code; prefer atomic_long_xchg_acquire() elsewhere. 642 + * 643 + * Return: The original value of @v. 
644 + */ 1285 645 static __always_inline long 1286 646 raw_atomic_long_xchg_acquire(atomic_long_t *v, long new) 1287 647 { ··· 1303 641 #endif 1304 642 } 1305 643 644 + /** 645 + * raw_atomic_long_xchg_release() - atomic exchange with release ordering 646 + * @v: pointer to atomic_long_t 647 + * @new: long value to assign 648 + * 649 + * Atomically updates @v to @new with release ordering. 650 + * 651 + * Safe to use in noinstr code; prefer atomic_long_xchg_release() elsewhere. 652 + * 653 + * Return: The original value of @v. 654 + */ 1306 655 static __always_inline long 1307 656 raw_atomic_long_xchg_release(atomic_long_t *v, long new) 1308 657 { ··· 1324 651 #endif 1325 652 } 1326 653 654 + /** 655 + * raw_atomic_long_xchg_relaxed() - atomic exchange with relaxed ordering 656 + * @v: pointer to atomic_long_t 657 + * @new: long value to assign 658 + * 659 + * Atomically updates @v to @new with relaxed ordering. 660 + * 661 + * Safe to use in noinstr code; prefer atomic_long_xchg_relaxed() elsewhere. 662 + * 663 + * Return: The original value of @v. 664 + */ 1327 665 static __always_inline long 1328 666 raw_atomic_long_xchg_relaxed(atomic_long_t *v, long new) 1329 667 { ··· 1345 661 #endif 1346 662 } 1347 663 664 + /** 665 + * raw_atomic_long_cmpxchg() - atomic compare and exchange with full ordering 666 + * @v: pointer to atomic_long_t 667 + * @old: long value to compare with 668 + * @new: long value to assign 669 + * 670 + * If (@v == @old), atomically updates @v to @new with full ordering. 671 + * 672 + * Safe to use in noinstr code; prefer atomic_long_cmpxchg() elsewhere. 673 + * 674 + * Return: The original value of @v. 
675 + */ 1348 676 static __always_inline long 1349 677 raw_atomic_long_cmpxchg(atomic_long_t *v, long old, long new) 1350 678 { ··· 1367 671 #endif 1368 672 } 1369 673 674 + /** 675 + * raw_atomic_long_cmpxchg_acquire() - atomic compare and exchange with acquire ordering 676 + * @v: pointer to atomic_long_t 677 + * @old: long value to compare with 678 + * @new: long value to assign 679 + * 680 + * If (@v == @old), atomically updates @v to @new with acquire ordering. 681 + * 682 + * Safe to use in noinstr code; prefer atomic_long_cmpxchg_acquire() elsewhere. 683 + * 684 + * Return: The original value of @v. 685 + */ 1370 686 static __always_inline long 1371 687 raw_atomic_long_cmpxchg_acquire(atomic_long_t *v, long old, long new) 1372 688 { ··· 1389 681 #endif 1390 682 } 1391 683 684 + /** 685 + * raw_atomic_long_cmpxchg_release() - atomic compare and exchange with release ordering 686 + * @v: pointer to atomic_long_t 687 + * @old: long value to compare with 688 + * @new: long value to assign 689 + * 690 + * If (@v == @old), atomically updates @v to @new with release ordering. 691 + * 692 + * Safe to use in noinstr code; prefer atomic_long_cmpxchg_release() elsewhere. 693 + * 694 + * Return: The original value of @v. 695 + */ 1392 696 static __always_inline long 1393 697 raw_atomic_long_cmpxchg_release(atomic_long_t *v, long old, long new) 1394 698 { ··· 1411 691 #endif 1412 692 } 1413 693 694 + /** 695 + * raw_atomic_long_cmpxchg_relaxed() - atomic compare and exchange with relaxed ordering 696 + * @v: pointer to atomic_long_t 697 + * @old: long value to compare with 698 + * @new: long value to assign 699 + * 700 + * If (@v == @old), atomically updates @v to @new with relaxed ordering. 701 + * 702 + * Safe to use in noinstr code; prefer atomic_long_cmpxchg_relaxed() elsewhere. 703 + * 704 + * Return: The original value of @v. 
705 + */ 1414 706 static __always_inline long 1415 707 raw_atomic_long_cmpxchg_relaxed(atomic_long_t *v, long old, long new) 1416 708 { ··· 1433 701 #endif 1434 702 } 1435 703 704 + /** 705 + * raw_atomic_long_try_cmpxchg() - atomic compare and exchange with full ordering 706 + * @v: pointer to atomic_long_t 707 + * @old: pointer to long value to compare with 708 + * @new: long value to assign 709 + * 710 + * If (@v == @old), atomically updates @v to @new with full ordering. 711 + * Otherwise, updates @old to the current value of @v. 712 + * 713 + * Safe to use in noinstr code; prefer atomic_long_try_cmpxchg() elsewhere. 714 + * 715 + * Return: @true if the exchange occurred, @false otherwise. 716 + */ 1436 717 static __always_inline bool 1437 718 raw_atomic_long_try_cmpxchg(atomic_long_t *v, long *old, long new) 1438 719 { ··· 1456 711 #endif 1457 712 } 1458 713 714 + /** 715 + * raw_atomic_long_try_cmpxchg_acquire() - atomic compare and exchange with acquire ordering 716 + * @v: pointer to atomic_long_t 717 + * @old: pointer to long value to compare with 718 + * @new: long value to assign 719 + * 720 + * If (@v == @old), atomically updates @v to @new with acquire ordering. 721 + * Otherwise, updates @old to the current value of @v. 722 + * 723 + * Safe to use in noinstr code; prefer atomic_long_try_cmpxchg_acquire() elsewhere. 724 + * 725 + * Return: @true if the exchange occurred, @false otherwise. 726 + */ 1459 727 static __always_inline bool 1460 728 raw_atomic_long_try_cmpxchg_acquire(atomic_long_t *v, long *old, long new) 1461 729 { ··· 1479 721 #endif 1480 722 } 1481 723 724 + /** 725 + * raw_atomic_long_try_cmpxchg_release() - atomic compare and exchange with release ordering 726 + * @v: pointer to atomic_long_t 727 + * @old: pointer to long value to compare with 728 + * @new: long value to assign 729 + * 730 + * If (@v == @old), atomically updates @v to @new with release ordering. 731 + * Otherwise, updates @old to the current value of @v.
732 + * 733 + * Safe to use in noinstr code; prefer atomic_long_try_cmpxchg_release() elsewhere. 734 + * 735 + * Return: @true if the exchange occurred, @false otherwise. 736 + */ 1482 737 static __always_inline bool 1483 738 raw_atomic_long_try_cmpxchg_release(atomic_long_t *v, long *old, long new) 1484 739 { ··· 1502 731 #endif 1503 732 } 1504 733 734 + /** 735 + * raw_atomic_long_try_cmpxchg_relaxed() - atomic compare and exchange with relaxed ordering 736 + * @v: pointer to atomic_long_t 737 + * @old: pointer to long value to compare with 738 + * @new: long value to assign 739 + * 740 + * If (@v == @old), atomically updates @v to @new with relaxed ordering. 741 + * Otherwise, updates @old to the current value of @v. 742 + * 743 + * Safe to use in noinstr code; prefer atomic_long_try_cmpxchg_relaxed() elsewhere. 744 + * 745 + * Return: @true if the exchange occurred, @false otherwise. 746 + */ 1505 747 static __always_inline bool 1506 748 raw_atomic_long_try_cmpxchg_relaxed(atomic_long_t *v, long *old, long new) 1507 749 { ··· 1525 741 #endif 1526 742 } 1527 743 744 + /** 745 + * raw_atomic_long_sub_and_test() - atomic subtract and test if zero with full ordering 746 + * @i: long value to subtract 747 + * @v: pointer to atomic_long_t 748 + * 749 + * Atomically updates @v to (@v - @i) with full ordering. 750 + * 751 + * Safe to use in noinstr code; prefer atomic_long_sub_and_test() elsewhere. 752 + * 753 + * Return: @true if the resulting value of @v is zero, @false otherwise. 754 + */ 1528 755 static __always_inline bool 1529 756 raw_atomic_long_sub_and_test(long i, atomic_long_t *v) 1530 757 { ··· 1546 751 #endif 1547 752 } 1548 753 754 + /** 755 + * raw_atomic_long_dec_and_test() - atomic decrement and test if zero with full ordering 756 + * @v: pointer to atomic_long_t 757 + * 758 + * Atomically updates @v to (@v - 1) with full ordering. 759 + * 760 + * Safe to use in noinstr code; prefer atomic_long_dec_and_test() elsewhere.
761 + * 762 + * Return: @true if the resulting value of @v is zero, @false otherwise. 763 + */ 1549 764 static __always_inline bool 1550 765 raw_atomic_long_dec_and_test(atomic_long_t *v) 1551 766 { ··· 1566 761 #endif 1567 762 } 1568 763 764 + /** 765 + * raw_atomic_long_inc_and_test() - atomic increment and test if zero with full ordering 766 + * @v: pointer to atomic_long_t 767 + * 768 + * Atomically updates @v to (@v + 1) with full ordering. 769 + * 770 + * Safe to use in noinstr code; prefer atomic_long_inc_and_test() elsewhere. 771 + * 772 + * Return: @true if the resulting value of @v is zero, @false otherwise. 773 + */ 1569 774 static __always_inline bool 1570 775 raw_atomic_long_inc_and_test(atomic_long_t *v) 1571 776 { ··· 1586 771 #endif 1587 772 } 1588 773 774 + /** 775 + * raw_atomic_long_add_negative() - atomic add and test if negative with full ordering 776 + * @i: long value to add 777 + * @v: pointer to atomic_long_t 778 + * 779 + * Atomically updates @v to (@v + @i) with full ordering. 780 + * 781 + * Safe to use in noinstr code; prefer atomic_long_add_negative() elsewhere. 782 + * 783 + * Return: @true if the resulting value of @v is negative, @false otherwise. 784 + */ 1589 785 static __always_inline bool 1590 786 raw_atomic_long_add_negative(long i, atomic_long_t *v) 1591 787 { ··· 1607 781 #endif 1608 782 } 1609 783 784 + /** 785 + * raw_atomic_long_add_negative_acquire() - atomic add and test if negative with acquire ordering 786 + * @i: long value to add 787 + * @v: pointer to atomic_long_t 788 + * 789 + * Atomically updates @v to (@v + @i) with acquire ordering. 790 + * 791 + * Safe to use in noinstr code; prefer atomic_long_add_negative_acquire() elsewhere. 792 + * 793 + * Return: @true if the resulting value of @v is negative, @false otherwise. 
794 + */ 1610 795 static __always_inline bool 1611 796 raw_atomic_long_add_negative_acquire(long i, atomic_long_t *v) 1612 797 { ··· 1628 791 #endif 1629 792 } 1630 793 794 + /** 795 + * raw_atomic_long_add_negative_release() - atomic add and test if negative with release ordering 796 + * @i: long value to add 797 + * @v: pointer to atomic_long_t 798 + * 799 + * Atomically updates @v to (@v + @i) with release ordering. 800 + * 801 + * Safe to use in noinstr code; prefer atomic_long_add_negative_release() elsewhere. 802 + * 803 + * Return: @true if the resulting value of @v is negative, @false otherwise. 804 + */ 1631 805 static __always_inline bool 1632 806 raw_atomic_long_add_negative_release(long i, atomic_long_t *v) 1633 807 { ··· 1649 801 #endif 1650 802 } 1651 803 804 + /** 805 + * raw_atomic_long_add_negative_relaxed() - atomic add and test if negative with relaxed ordering 806 + * @i: long value to add 807 + * @v: pointer to atomic_long_t 808 + * 809 + * Atomically updates @v to (@v + @i) with relaxed ordering. 810 + * 811 + * Safe to use in noinstr code; prefer atomic_long_add_negative_relaxed() elsewhere. 812 + * 813 + * Return: @true if the resulting value of @v is negative, @false otherwise. 814 + */ 1652 815 static __always_inline bool 1653 816 raw_atomic_long_add_negative_relaxed(long i, atomic_long_t *v) 1654 817 { ··· 1670 811 #endif 1671 812 } 1672 813 814 + /** 815 + * raw_atomic_long_fetch_add_unless() - atomic add unless value with full ordering 816 + * @v: pointer to atomic_long_t 817 + * @a: long value to add 818 + * @u: long value to compare with 819 + * 820 + * If (@v != @u), atomically updates @v to (@v + @a) with full ordering. 821 + * 822 + * Safe to use in noinstr code; prefer atomic_long_fetch_add_unless() elsewhere. 823 + * 824 + * Return: The original value of @v. 
825 + */ 1673 826 static __always_inline long 1674 827 raw_atomic_long_fetch_add_unless(atomic_long_t *v, long a, long u) 1675 828 { ··· 1692 821 #endif 1693 822 } 1694 823 824 + /** 825 + * raw_atomic_long_add_unless() - atomic add unless value with full ordering 826 + * @v: pointer to atomic_long_t 827 + * @a: long value to add 828 + * @u: long value to compare with 829 + * 830 + * If (@v != @u), atomically updates @v to (@v + @a) with full ordering. 831 + * 832 + * Safe to use in noinstr code; prefer atomic_long_add_unless() elsewhere. 833 + * 834 + * Return: @true if @v was updated, @false otherwise. 835 + */ 1695 836 static __always_inline bool 1696 837 raw_atomic_long_add_unless(atomic_long_t *v, long a, long u) 1697 838 { ··· 1714 831 #endif 1715 832 } 1716 833 834 + /** 835 + * raw_atomic_long_inc_not_zero() - atomic increment unless zero with full ordering 836 + * @v: pointer to atomic_long_t 837 + * 838 + * If (@v != 0), atomically updates @v to (@v + 1) with full ordering. 839 + * 840 + * Safe to use in noinstr code; prefer atomic_long_inc_not_zero() elsewhere. 841 + * 842 + * Return: @true if @v was updated, @false otherwise. 843 + */ 1717 844 static __always_inline bool 1718 845 raw_atomic_long_inc_not_zero(atomic_long_t *v) 1719 846 { ··· 1734 841 #endif 1735 842 } 1736 843 844 + /** 845 + * raw_atomic_long_inc_unless_negative() - atomic increment unless negative with full ordering 846 + * @v: pointer to atomic_long_t 847 + * 848 + * If (@v >= 0), atomically updates @v to (@v + 1) with full ordering. 849 + * 850 + * Safe to use in noinstr code; prefer atomic_long_inc_unless_negative() elsewhere. 851 + * 852 + * Return: @true if @v was updated, @false otherwise. 
853 + */ 1737 854 static __always_inline bool 1738 855 raw_atomic_long_inc_unless_negative(atomic_long_t *v) 1739 856 { ··· 1754 851 #endif 1755 852 } 1756 853 854 + /** 855 + * raw_atomic_long_dec_unless_positive() - atomic decrement unless positive with full ordering 856 + * @v: pointer to atomic_long_t 857 + * 858 + * If (@v <= 0), atomically updates @v to (@v - 1) with full ordering. 859 + * 860 + * Safe to use in noinstr code; prefer atomic_long_dec_unless_positive() elsewhere. 861 + * 862 + * Return: @true if @v was updated, @false otherwise. 863 + */ 1757 864 static __always_inline bool 1758 865 raw_atomic_long_dec_unless_positive(atomic_long_t *v) 1759 866 { ··· 1774 861 #endif 1775 862 } 1776 863 864 + /** 865 + * raw_atomic_long_dec_if_positive() - atomic decrement if positive with full ordering 866 + * @v: pointer to atomic_long_t 867 + * 868 + * If (@v > 0), atomically updates @v to (@v - 1) with full ordering. 869 + * 870 + * Safe to use in noinstr code; prefer atomic_long_dec_if_positive() elsewhere. 871 + * 872 + * Return: The old value of (@v - 1), regardless of whether @v was updated. 873 + */ 1777 874 static __always_inline long 1778 875 raw_atomic_long_dec_if_positive(atomic_long_t *v) 1779 876 { ··· 1795 872 } 1796 873 1797 874 #endif /* _LINUX_ATOMIC_LONG_H */ 1798 - // e785d25cc3f220b7d473d36aac9da85dd7eb13a8 875 + // 029d2e3a493086671e874a4c2e0e42084be42403
+108 -4
scripts/atomic/atomic-tbl.sh
··· 36 36 meta_in "$1" "BFIR" 37 37 } 38 38 39 - #find_fallback_template(pfx, name, sfx, order) 40 - find_fallback_template() 39 + #meta_is_implicitly_relaxed(meta) 40 + meta_is_implicitly_relaxed() 41 41 { 42 + meta_in "$1" "vls" 43 + } 44 + 45 + #find_template(tmpltype, pfx, name, sfx, order) 46 + find_template() 47 + { 48 + local tmpltype="$1"; shift 42 49 local pfx="$1"; shift 43 50 local name="$1"; shift 44 51 local sfx="$1"; shift ··· 59 52 # 60 53 # Start at the most specific, and fall back to the most general. Once 61 54 # we find a specific fallback, don't bother looking for more. 62 - for base in "${pfx}${name}${sfx}${order}" "${name}"; do 63 - file="${ATOMICDIR}/fallbacks/${base}" 55 + for base in "${pfx}${name}${sfx}${order}" "${pfx}${name}${sfx}" "${name}"; do 56 + file="${ATOMICDIR}/${tmpltype}/${base}" 64 57 65 58 if [ -f "${file}" ]; then 66 59 printf "${file}" 67 60 break 68 61 fi 69 62 done 63 + } 64 + 65 + #find_fallback_template(pfx, name, sfx, order) 66 + find_fallback_template() 67 + { 68 + find_template "fallbacks" "$@" 69 + } 70 + 71 + #find_kerneldoc_template(pfx, name, sfx, order) 72 + find_kerneldoc_template() 73 + { 74 + find_template "kerneldoc" "$@" 70 75 } 71 76 72 77 #gen_ret_type(meta, int) ··· 159 140 [ "$#" -gt 1 ] && printf ", " 160 141 shift; 161 142 done 143 + } 144 + 145 + #gen_desc_return(meta) 146 + gen_desc_return() 147 + { 148 + local meta="$1"; shift 149 + 150 + case "${meta}" in 151 + [v]) 152 + printf "Return: Nothing." 153 + ;; 154 + [Ff]) 155 + printf "Return: The original value of @v." 156 + ;; 157 + [R]) 158 + printf "Return: The updated value of @v." 159 + ;; 160 + [l]) 161 + printf "Return: The value of @v." 162 + ;; 163 + esac 164 + } 165 + 166 + #gen_template_kerneldoc(template, class, meta, pfx, name, sfx, order, atomic, int, args...) 
167 + gen_template_kerneldoc() 168 + { 169 + local template="$1"; shift 170 + local class="$1"; shift 171 + local meta="$1"; shift 172 + local pfx="$1"; shift 173 + local name="$1"; shift 174 + local sfx="$1"; shift 175 + local order="$1"; shift 176 + local atomic="$1"; shift 177 + local int="$1"; shift 178 + 179 + local atomicname="${atomic}_${pfx}${name}${sfx}${order}" 180 + 181 + local ret="$(gen_ret_type "${meta}" "${int}")" 182 + local retstmt="$(gen_ret_stmt "${meta}")" 183 + local params="$(gen_params "${int}" "${atomic}" "$@")" 184 + local args="$(gen_args "$@")" 185 + local desc_order="" 186 + local desc_instrumentation="" 187 + local desc_return="" 188 + 189 + if [ ! -z "${order}" ]; then 190 + desc_order="${order##_}" 191 + elif meta_is_implicitly_relaxed "${meta}"; then 192 + desc_order="relaxed" 193 + else 194 + desc_order="full" 195 + fi 196 + 197 + if [ -z "${class}" ]; then 198 + desc_noinstr="Unsafe to use in noinstr code; use raw_${atomicname}() there." 199 + else 200 + desc_noinstr="Safe to use in noinstr code; prefer ${atomicname}() elsewhere." 201 + fi 202 + 203 + desc_return="$(gen_desc_return "${meta}")" 204 + 205 + . ${template} 206 + } 207 + 208 + #gen_kerneldoc(class, meta, pfx, name, sfx, order, atomic, int, args...) 209 + gen_kerneldoc() 210 + { 211 + local class="$1"; shift 212 + local meta="$1"; shift 213 + local pfx="$1"; shift 214 + local name="$1"; shift 215 + local sfx="$1"; shift 216 + local order="$1"; shift 217 + 218 + local atomicname="${atomic}_${pfx}${name}${sfx}${order}" 219 + 220 + local tmpl="$(find_kerneldoc_template "${pfx}" "${name}" "${sfx}" "${order}")" 221 + if [ -z "${tmpl}" ]; then 222 + printf "/*\n" 223 + printf " * No kerneldoc available for ${class}${atomicname}\n" 224 + printf " */\n" 225 + else 226 + gen_template_kerneldoc "${tmpl}" "${class}" "${meta}" "${pfx}" "${name}" "${sfx}" "${order}" "$@" 227 + fi 162 228 } 163 229 164 230 #gen_proto_order_variants(meta, pfx, name, sfx, ...)
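The probe order in find_template() can be exercised on its own. A minimal standalone sketch (a throwaway temp directory stands in for the real scripts/atomic tree; only the file layout is assumed):

```shell
#!/bin/sh
# Sketch of find_template()'s probe order: most specific
# ("${pfx}${name}${sfx}${order}"), then "${pfx}${name}${sfx}", then the
# bare "${name}". A temp dir stands in for the real scripts/atomic tree.
ATOMICDIR="$(mktemp -d)"
mkdir -p "${ATOMICDIR}/kerneldoc"
# Only the shared "add" template exists here.
touch "${ATOMICDIR}/kerneldoc/add"

find_template()
{
	local tmpltype="$1"; shift
	local pfx="$1"; shift
	local name="$1"; shift
	local sfx="$1"; shift
	local order="$1"; shift

	local base file
	for base in "${pfx}${name}${sfx}${order}" "${pfx}${name}${sfx}" "${name}"; do
		file="${ATOMICDIR}/${tmpltype}/${base}"
		if [ -f "${file}" ]; then
			printf '%s' "${file}"
			break
		fi
	done
}

# "fetch_add_acquire" and "fetch_add" have no dedicated template, so the
# probe falls through to the shared "add" template.
tmpl="$(find_template "kerneldoc" "fetch_" "add" "" "_acquire")"
echo "resolved: ${tmpl}"
rm -rf "${ATOMICDIR}"
```

This is why a single kerneldoc/add file covers the whole fetch_add family: the ordering suffix and fetch_ prefix only matter when a more specific template file actually exists.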
+2
scripts/atomic/gen-atomic-fallback.sh
··· 73 73 local params="$(gen_params "${int}" "${atomic}" "$@")" 74 74 local args="$(gen_args "$@")" 75 75 76 + gen_kerneldoc "raw_" "${meta}" "${pfx}" "${name}" "${sfx}" "${order}" "${atomic}" "${int}" "$@" 77 + 76 78 printf "static __always_inline ${ret}\n" 77 79 printf "raw_${atomicname}(${params})\n" 78 80 printf "{\n"
+2
scripts/atomic/gen-atomic-instrumented.sh
··· 68 68 local args="$(gen_args "$@")" 69 69 local retstmt="$(gen_ret_stmt "${meta}")" 70 70 71 + gen_kerneldoc "" "${meta}" "${pfx}" "${name}" "${sfx}" "${order}" "${atomic}" "${int}" "$@" 72 + 71 73 cat <<EOF 72 74 static __always_inline ${ret} 73 75 ${atomicname}(${params})
+2
scripts/atomic/gen-atomic-long.sh
··· 49 49 local argscast_64="$(gen_args_cast "s64" "atomic64" "$@")" 50 50 local retstmt="$(gen_ret_stmt "${meta}")" 51 51 52 + gen_kerneldoc "raw_" "${meta}" "${pfx}" "${name}" "${sfx}" "${order}" "atomic_long" "long" "$@" 53 + 52 54 cat <<EOF 53 55 static __always_inline ${ret} 54 56 raw_atomic_long_${atomicname}(${params})
+13
scripts/atomic/kerneldoc/add
··· 1 + cat <<EOF 2 + /** 3 + * ${class}${atomicname}() - atomic add with ${desc_order} ordering 4 + * @i: ${int} value to add 5 + * @v: pointer to ${atomic}_t 6 + * 7 + * Atomically updates @v to (@v + @i) with ${desc_order} ordering. 8 + * 9 + * ${desc_noinstr} 10 + * 11 + * ${desc_return} 12 + */ 13 + EOF
+13
scripts/atomic/kerneldoc/add_negative
··· 1 + cat <<EOF 2 + /** 3 + * ${class}${atomicname}() - atomic add and test if negative with ${desc_order} ordering 4 + * @i: ${int} value to add 5 + * @v: pointer to ${atomic}_t 6 + * 7 + * Atomically updates @v to (@v + @i) with ${desc_order} ordering. 8 + * 9 + * ${desc_noinstr} 10 + * 11 + * Return: @true if the resulting value of @v is negative, @false otherwise. 12 + */ 13 + EOF
+18
scripts/atomic/kerneldoc/add_unless
··· 1 + if [ -z "${pfx}" ]; then 2 + desc_return="Return: @true if @v was updated, @false otherwise." 3 + fi 4 + 5 + cat <<EOF 6 + /** 7 + * ${class}${atomicname}() - atomic add unless value with ${desc_order} ordering 8 + * @v: pointer to ${atomic}_t 9 + * @a: ${int} value to add 10 + * @u: ${int} value to compare with 11 + * 12 + * If (@v != @u), atomically updates @v to (@v + @a) with ${desc_order} ordering. 13 + * 14 + * ${desc_noinstr} 15 + * 16 + * ${desc_return} 17 + */ 18 + EOF
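The pfx-conditional above is what differentiates the two generated comments for this template: fetch_add_unless keeps the meta-derived old-value wording, while plain add_unless (whose meta character has no case in gen_desc_return) gets the boolean wording. A tiny standalone sketch of that branch (describe_return is a hypothetical helper, not part of the scripts):

```shell
#!/bin/sh
# Sketch of the pfx-conditional in kerneldoc/add_unless: fetch_add_unless
# keeps the meta-derived Return line, while plain add_unless (a bool-
# returning op, with no matching case in gen_desc_return) is overridden.
describe_return()
{
	local pfx="$1"
	local desc_return="Return: The original value of @v."
	if [ -z "${pfx}" ]; then
		desc_return="Return: @true if @v was updated, @false otherwise."
	fi
	printf '%s\n' "${desc_return}"
}

describe_return "fetch_"	# old-value wording
describe_return ""		# boolean wording
```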
+13
scripts/atomic/kerneldoc/and
··· 1 + cat <<EOF 2 + /** 3 + * ${class}${atomicname}() - atomic bitwise AND with ${desc_order} ordering 4 + * @i: ${int} value 5 + * @v: pointer to ${atomic}_t 6 + * 7 + * Atomically updates @v to (@v & @i) with ${desc_order} ordering. 8 + * 9 + * ${desc_noinstr} 10 + * 11 + * ${desc_return} 12 + */ 13 + EOF
+13
scripts/atomic/kerneldoc/andnot
··· 1 + cat <<EOF 2 + /** 3 + * ${class}${atomicname}() - atomic bitwise AND NOT with ${desc_order} ordering 4 + * @i: ${int} value 5 + * @v: pointer to ${atomic}_t 6 + * 7 + * Atomically updates @v to (@v & ~@i) with ${desc_order} ordering. 8 + * 9 + * ${desc_noinstr} 10 + * 11 + * ${desc_return} 12 + */ 13 + EOF
+14
scripts/atomic/kerneldoc/cmpxchg
··· 1 + cat <<EOF 2 + /** 3 + * ${class}${atomicname}() - atomic compare and exchange with ${desc_order} ordering 4 + * @v: pointer to ${atomic}_t 5 + * @old: ${int} value to compare with 6 + * @new: ${int} value to assign 7 + * 8 + * If (@v == @old), atomically updates @v to @new with ${desc_order} ordering. 9 + * 10 + * ${desc_noinstr} 11 + * 12 + * Return: The original value of @v. 13 + */ 14 + EOF
+12
scripts/atomic/kerneldoc/dec
··· 1 + cat <<EOF 2 + /** 3 + * ${class}${atomicname}() - atomic decrement with ${desc_order} ordering 4 + * @v: pointer to ${atomic}_t 5 + * 6 + * Atomically updates @v to (@v - 1) with ${desc_order} ordering. 7 + * 8 + * ${desc_noinstr} 9 + * 10 + * ${desc_return} 11 + */ 12 + EOF
+12
scripts/atomic/kerneldoc/dec_and_test
··· 1 + cat <<EOF 2 + /** 3 + * ${class}${atomicname}() - atomic decrement and test if zero with ${desc_order} ordering 4 + * @v: pointer to ${atomic}_t 5 + * 6 + * Atomically updates @v to (@v - 1) with ${desc_order} ordering. 7 + * 8 + * ${desc_noinstr} 9 + * 10 + * Return: @true if the resulting value of @v is zero, @false otherwise. 11 + */ 12 + EOF
+12
scripts/atomic/kerneldoc/dec_if_positive
··· 1 + cat <<EOF 2 + /** 3 + * ${class}${atomicname}() - atomic decrement if positive with ${desc_order} ordering 4 + * @v: pointer to ${atomic}_t 5 + * 6 + * If (@v > 0), atomically updates @v to (@v - 1) with ${desc_order} ordering. 7 + * 8 + * ${desc_noinstr} 9 + * 10 + * Return: The old value of (@v - 1), regardless of whether @v was updated. 11 + */ 12 + EOF
+12
scripts/atomic/kerneldoc/dec_unless_positive
··· 1 + cat <<EOF 2 + /** 3 + * ${class}${atomicname}() - atomic decrement unless positive with ${desc_order} ordering 4 + * @v: pointer to ${atomic}_t 5 + * 6 + * If (@v <= 0), atomically updates @v to (@v - 1) with ${desc_order} ordering. 7 + * 8 + * ${desc_noinstr} 9 + * 10 + * Return: @true if @v was updated, @false otherwise. 11 + */ 12 + EOF
+12
scripts/atomic/kerneldoc/inc
··· 1 + cat <<EOF 2 + /** 3 + * ${class}${atomicname}() - atomic increment with ${desc_order} ordering 4 + * @v: pointer to ${atomic}_t 5 + * 6 + * Atomically updates @v to (@v + 1) with ${desc_order} ordering. 7 + * 8 + * ${desc_noinstr} 9 + * 10 + * ${desc_return} 11 + */ 12 + EOF
+12
scripts/atomic/kerneldoc/inc_and_test
··· 1 + cat <<EOF 2 + /** 3 + * ${class}${atomicname}() - atomic increment and test if zero with ${desc_order} ordering 4 + * @v: pointer to ${atomic}_t 5 + * 6 + * Atomically updates @v to (@v + 1) with ${desc_order} ordering. 7 + * 8 + * ${desc_noinstr} 9 + * 10 + * Return: @true if the resulting value of @v is zero, @false otherwise. 11 + */ 12 + EOF
+12
scripts/atomic/kerneldoc/inc_not_zero
··· 1 + cat <<EOF 2 + /** 3 + * ${class}${atomicname}() - atomic increment unless zero with ${desc_order} ordering 4 + * @v: pointer to ${atomic}_t 5 + * 6 + * If (@v != 0), atomically updates @v to (@v + 1) with ${desc_order} ordering. 7 + * 8 + * ${desc_noinstr} 9 + * 10 + * Return: @true if @v was updated, @false otherwise. 11 + */ 12 + EOF
+12
scripts/atomic/kerneldoc/inc_unless_negative
··· 1 + cat <<EOF 2 + /** 3 + * ${class}${atomicname}() - atomic increment unless negative with ${desc_order} ordering 4 + * @v: pointer to ${atomic}_t 5 + * 6 + * If (@v >= 0), atomically updates @v to (@v + 1) with ${desc_order} ordering. 7 + * 8 + * ${desc_noinstr} 9 + * 10 + * Return: @true if @v was updated, @false otherwise. 11 + */ 12 + EOF
+13
scripts/atomic/kerneldoc/or
··· 1 + cat <<EOF 2 + /** 3 + * ${class}${atomicname}() - atomic bitwise OR with ${desc_order} ordering 4 + * @i: ${int} value 5 + * @v: pointer to ${atomic}_t 6 + * 7 + * Atomically updates @v to (@v | @i) with ${desc_order} ordering. 8 + * 9 + * ${desc_noinstr} 10 + * 11 + * ${desc_return} 12 + */ 13 + EOF
+12
scripts/atomic/kerneldoc/read
··· 1 + cat <<EOF 2 + /** 3 + * ${class}${atomicname}() - atomic load with ${desc_order} ordering 4 + * @v: pointer to ${atomic}_t 5 + * 6 + * Atomically loads the value of @v with ${desc_order} ordering. 7 + * 8 + * ${desc_noinstr} 9 + * 10 + * Return: The value loaded from @v. 11 + */ 12 + EOF
+13
scripts/atomic/kerneldoc/set
··· 1 + cat <<EOF 2 + /** 3 + * ${class}${atomicname}() - atomic set with ${desc_order} ordering 4 + * @v: pointer to ${atomic}_t 5 + * @i: ${int} value to assign 6 + * 7 + * Atomically sets @v to @i with ${desc_order} ordering. 8 + * 9 + * ${desc_noinstr} 10 + * 11 + * Return: Nothing. 12 + */ 13 + EOF
+13
scripts/atomic/kerneldoc/sub
··· 1 + cat <<EOF 2 + /** 3 + * ${class}${atomicname}() - atomic subtract with ${desc_order} ordering 4 + * @i: ${int} value to subtract 5 + * @v: pointer to ${atomic}_t 6 + * 7 + * Atomically updates @v to (@v - @i) with ${desc_order} ordering. 8 + * 9 + * ${desc_noinstr} 10 + * 11 + * ${desc_return} 12 + */ 13 + EOF
+13
scripts/atomic/kerneldoc/sub_and_test
··· 1 + cat <<EOF 2 + /** 3 + * ${class}${atomicname}() - atomic subtract and test if zero with ${desc_order} ordering 4 + * @i: ${int} value to subtract 5 + * @v: pointer to ${atomic}_t 6 + * 7 + * Atomically updates @v to (@v - @i) with ${desc_order} ordering. 8 + * 9 + * ${desc_noinstr} 10 + * 11 + * Return: @true if the resulting value of @v is zero, @false otherwise. 12 + */ 13 + EOF
+15
scripts/atomic/kerneldoc/try_cmpxchg
··· 1 + cat <<EOF 2 + /** 3 + * ${class}${atomicname}() - atomic compare and exchange with ${desc_order} ordering 4 + * @v: pointer to ${atomic}_t 5 + * @old: pointer to ${int} value to compare with 6 + * @new: ${int} value to assign 7 + * 8 + * If (@v == @old), atomically updates @v to @new with ${desc_order} ordering. 9 + * Otherwise, updates @old to the current value of @v. 10 + * 11 + * ${desc_noinstr} 12 + * 13 + * Return: @true if the exchange occurred, @false otherwise. 14 + */ 15 + EOF
+13
scripts/atomic/kerneldoc/xchg
··· 1 + cat <<EOF 2 + /** 3 + * ${class}${atomicname}() - atomic exchange with ${desc_order} ordering 4 + * @v: pointer to ${atomic}_t 5 + * @new: ${int} value to assign 6 + * 7 + * Atomically updates @v to @new with ${desc_order} ordering. 8 + * 9 + * ${desc_noinstr} 10 + * 11 + * Return: The original value of @v. 12 + */ 13 + EOF
+13
scripts/atomic/kerneldoc/xor
··· 1 + cat <<EOF 2 + /** 3 + * ${class}${atomicname}() - atomic bitwise XOR with ${desc_order} ordering 4 + * @i: ${int} value 5 + * @v: pointer to ${atomic}_t 6 + * 7 + * Atomically updates @v to (@v ^ @i) with ${desc_order} ordering. 8 + * 9 + * ${desc_noinstr} 10 + * 11 + * ${desc_return} 12 + */ 13 + EOF
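Taken together, the generator expands one of these templates by setting the interpolation variables and sourcing the file with `. ${template}`; the template's `cat <<EOF` heredoc then substitutes them. A minimal standalone sketch (a temp file stands in for scripts/atomic/kerneldoc/xor, and the variable values mirror what gen_template_kerneldoc() would compute for raw_atomic_long_xor()):

```shell
#!/bin/sh
# Sketch of kerneldoc template expansion: write a copy of the xor
# template to a temp file, set the variables the generator would set,
# then source it so the heredoc interpolates them.
tmpl="$(mktemp)"
cat > "${tmpl}" <<'TEMPLATE'
cat <<EOF
/**
 * ${class}${atomicname}() - atomic bitwise XOR with ${desc_order} ordering
 * @i: ${int} value
 * @v: pointer to ${atomic}_t
 *
 * Atomically updates @v to (@v ^ @i) with ${desc_order} ordering.
 *
 * ${desc_noinstr}
 *
 * ${desc_return}
 */
EOF
TEMPLATE

# Values as for raw_atomic_long_xor(): meta "v" is implicitly relaxed
# and returns nothing, and the raw_ class is noinstr-safe.
class="raw_"
atomic="atomic_long"
int="long"
atomicname="atomic_long_xor"
desc_order="relaxed"
desc_noinstr="Safe to use in noinstr code; prefer atomic_long_xor() elsewhere."
desc_return="Return: Nothing."

doc="$(. "${tmpl}")"
rm -f "${tmpl}"
printf '%s\n' "${doc}"
```

The printed comment matches the raw_atomic_long_xor() kerneldoc emitted into the generated header above, which is the whole point of the scheme: one template per operation family, shared across classes, types, and orderings.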