Feat & Fix: Bitcoin Cosmos SDK Prototype is now onboarded and able to broadcast with cosmos-sdk-broadcast.sh and build, go, and run upgrades! #23171

Closed
wants to merge 16 commits into from

Conversation

bearycool11

@bearycool11 bearycool11 commented Jan 3, 2025

Description

Begins closing #23171 (and the last 3 PRs, I think?)

3K lines of automation for the builders and runners and their human verifiers, plus checks on the runners and builders in the YAML workflows. Hold onto your butts, everyone: ludicrous speed is about to activate.


Author Checklist

All items are required. Please add a note to the item if the item is not applicable and
please add links to any relevant follow up issues.

I have...

  • included the correct type prefix in the PR title, you can find examples of the prefixes below:
  • confirmed ! in the type prefix if API or client breaking change
  • targeted the correct branch (see PR Targeting)
  • provided a link to the relevant issue or specification
  • reviewed "Files changed" and left comments if necessary
  • included the necessary unit and integration tests
  • added a changelog entry to CHANGELOG.md
  • updated the relevant documentation or specification, including comments for documenting Go code
  • confirmed all CI checks have passed

Reviewers Checklist

All items are required. Please add a note if the item is not applicable and please add
your handle next to the items reviewed if you only reviewed selected items.

Please see Pull Request Reviewer section in the contributing guide for more information on how to review a pull request.

I have...

  • confirmed the correct type prefix in the PR title
  • confirmed all author checklist items have been addressed
  • reviewed state machine logic, API design and naming, documentation is accurate, tests and test coverage

Summary by CodeRabbit

Release Notes

  • New Features

    • Added a GitHub Actions workflow for automated build and testing across Ubuntu, macOS, and Windows.
    • Introduced new scripts for Bitcoin transaction broadcasting and Cosmos SDK management.
    • Created Docker setup scripts for C++ development environments on Ubuntu and Windows.
    • Established a CI/CD pipeline for building applications and managing Docker images.
    • Implemented new rules for automerging and backporting changes in the repository.
    • Introduced a new GitHub Actions workflow for C/C++ continuous integration.
  • Chores

    • Enhanced development infrastructure with cross-platform build and testing configurations.
    • Improved Docker container management for development workflows.

Josef Kurk Edwards added 11 commits December 29, 2024 11:02
Full sweep to update and root a devcontainer for both Linux and Windows developers before 1-25-2025
#!/usr/bin/env python3
"""
Pure-Python demonstration of creating, signing, and broadcasting
a Bitcoin Testnet transaction with an OP_RETURN output, without
any external libraries (requests, bitcoinlib, ecdsa, etc.).
"""

import hashlib
import binascii
import json
import urllib.request
import urllib.error

# ---------------------------------------------------------------------------
# (1) Minimal Elliptic Curve (ECDSA) Implementation for secp256k1
# ---------------------------------------------------------------------------

# secp256k1 domain parameters
P  = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEFFFFFC2F
N  = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
A  = 0
B  = 7
Gx = 0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798
Gy = 0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8

def modinv(a, m):
    """Compute modular inverse of a mod m using Extended Euclidean Algorithm."""
    return pow(a, -1, m)

def point_add(x1, y1, x2, y2):
    """
    Add two points (x1, y1) and (x2, y2) on secp256k1.
    Returns (x3, y3).
    """
    if x1 is None and y1 is None:
        return x2, y2
    if x2 is None and y2 is None:
        return x1, y1

    if x1 == x2 and (y1 + y2) % P == 0:
        # P + (-P): the result is the point at infinity
        return None, None

    if x1 == x2 and y1 == y2:
        # Point doubling
        s = (3 * x1 * x1) * modinv(2 * y1, P) % P
    else:
        # Point addition
        dx = (x2 - x1) % P
        dy = (y2 - y1) % P
        s = (dy) * modinv(dx, P) % P

    x3 = (s * s - x1 - x2) % P
    y3 = (s * (x1 - x3) - y1) % P
    return x3, y3

def scalar_multiplication(k, x, y):
    """Compute k*(x, y) using the double-and-add algorithm."""
    rx, ry = None, None
    tx, ty = x, y
    while k > 0:
        if k & 1:
            rx, ry = point_add(rx, ry, tx, ty)
        tx, ty = point_add(tx, ty, tx, ty)
        k >>= 1
    return rx, ry

def privkey_to_pubkey(privkey_bytes, compressed=True):
    """Derive the public key (x, y) from a 32-byte private key."""
    priv_int = int.from_bytes(privkey_bytes, 'big')
    # Multiply generator G by priv_int
    x, y = scalar_multiplication(priv_int, Gx, Gy)
    if compressed:
        # Compressed pubkey format
        prefix = b'\x02' if (y % 2 == 0) else b'\x03'
        return prefix + x.to_bytes(32, 'big')
    else:
        # Uncompressed: 0x04 + X + Y
        return b'\x04' + x.to_bytes(32, 'big') + y.to_bytes(32, 'big')
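
# Illustrative sanity check (added note, not part of the original script):
# for the smallest valid private key k = 1, privkey_to_pubkey() should return the
# compressed encoding of the generator point G itself (Gy is even, hence the 0x02 prefix):
#
#   >>> privkey_to_pubkey((1).to_bytes(32, 'big')).hex()
#   '0279be667ef9dcbbac55a06295ce870b07029bfcdb2dce28d959f2815b16f81798'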

def sign_transaction(hash32, privkey_bytes):
    """
    Produce a compact DER ECDSA signature of hash32 using privkey_bytes.
    This is a minimal implementation and may omit some edge cases.
    """
    z = int.from_bytes(hash32, 'big')
    k = deterministic_k(z, privkey_bytes)
    r, s = raw_ecdsa_sign(z, privkey_bytes, k)

    # Make sure s is low (BIP 62)
    if s > (N // 2):
        s = N - s

    # Convert r, s to DER format
    return der_encode_sig(r, s)

def deterministic_k(z, privkey_bytes):
    """
    Very simplified RFC 6979 (deterministic k) generator for demonstration.
    """
    import hmac

    x = int.from_bytes(privkey_bytes, 'big')
    z = z % N
    if x > N:
        x = x - N

    # RFC6979 step: V = 0x01 32-byte, K = 0x00 32-byte
    k_bytes = b'\x00' * 32
    v_bytes = b'\x01' * 32
    priv_bytes_32 = x.to_bytes(32, 'big')
    z_bytes_32 = z.to_bytes(32, 'big')

    def hmac_sha256(key, data):
        return hmac.new(key, data, hashlib.sha256).digest()

    k_bytes = hmac_sha256(k_bytes, v_bytes + b'\x00' + priv_bytes_32 + z_bytes_32)
    v_bytes = hmac_sha256(k_bytes, v_bytes)

    k_bytes = hmac_sha256(k_bytes, v_bytes + b'\x01' + priv_bytes_32 + z_bytes_32)
    v_bytes = hmac_sha256(k_bytes, v_bytes)

    while True:
        v_bytes = hmac_sha256(k_bytes, v_bytes)
        t = int.from_bytes(v_bytes, 'big')
        if 1 <= t < N:
            return t
        k_bytes = hmac_sha256(k_bytes, v_bytes + b'\x00')
        v_bytes = hmac_sha256(k_bytes, v_bytes)

def raw_ecdsa_sign(z, privkey_bytes, k):
    """Sign with ECDSA using random nonce k (already determined)."""
    priv_int = int.from_bytes(privkey_bytes, 'big')
    # R = (k * G).x mod n
    x_r, _ = scalar_multiplication(k, Gx, Gy)
    r = x_r % N
    if r == 0:
        raise Exception("Invalid r=0 in ECDSA signature")

    # s = k^-1 (z + r*priv) mod n
    s = (modinv(k, N) * (z + r*priv_int)) % N
    if s == 0:
        raise Exception("Invalid s=0 in ECDSA signature")
    return (r, s)

def der_encode_sig(r, s):
    """DER-encode the r, s ECDSA values."""
    def encode_int(x):
        xb = x.to_bytes((x.bit_length() + 7) // 8, 'big')
        # If high bit is set, prefix with 0x00
        if xb[0] & 0x80:
            xb = b'\x00' + xb
        return xb

    rb = encode_int(r)
    sb = encode_int(s)
    # 0x02 <len> <rb> 0x02 <len> <sb>
    sequence = b'\x02' + bytes([len(rb)]) + rb + b'\x02' + bytes([len(sb)]) + sb
    # 0x30 <len> <sequence>
    return b'\x30' + bytes([len(sequence)]) + sequence
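
# Worked example of the DER layout produced above (illustrative only): for r = 1, s = 1
# the encoding is 30 06 | 02 01 01 | 02 01 01, i.e. "3006020101020101"
# (a SEQUENCE wrapping two length-prefixed INTEGERs).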

# ---------------------------------------------------------------------------
# (2) Basic Bitcoin Utility Functions
# ---------------------------------------------------------------------------

def base58_check_decode(s):
    """Decode a base58-check string to raw bytes (payload)."""
    alphabet = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"
    num = 0
    for char in s:
        num = num * 58 + alphabet.index(char)
    # Recover the full byte string; leading '1' characters encode leading zero bytes.
    n_pad = len(s) - len(s.lstrip('1'))
    combined = b'\x00' * n_pad + num.to_bytes((num.bit_length() + 7) // 8, byteorder='big')
    chk = combined[-4:]
    payload = combined[:-4]
    # Verify checksum
    hash_ = hashlib.sha256(hashlib.sha256(payload).digest()).digest()[:4]
    if hash_ != chk:
        raise ValueError("Invalid base58 checksum")
    return payload[1:]  # drop version byte

def wif_to_privkey(wif_str):
    """
    Convert a WIF private key (Testnet or Mainnet) into 32-byte raw.
    Assumes no compression byte or handles it if present.
    """
    raw = base58_check_decode(wif_str)
    # base58_check_decode() has already stripped the version byte (0xEF for
    # Testnet WIF, 0x80 for Mainnet), so `raw` is either 32 bytes (uncompressed
    # WIF) or 33 bytes (32-byte key followed by a 0x01 compression flag).
    if len(raw) == 33 and raw[-1] == 0x01:
        # Compressed-pubkey WIF
        return raw[:-1]  # strip the trailing 0x01 compression flag
    # Uncompressed
    return raw

def hash256(b):
    """SHA-256 twice."""
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

def ripemd160_sha256(b):
    """RIPEMD160(SHA-256(b))."""
    h = hashlib.new('ripemd160')
    h.update(hashlib.sha256(b).digest())
    return h.digest()

def little_endian_hex(txid):
    """
    Convert a TXID hex string (displayed big-endian) into the little-endian
    raw bytes used inside the serialized transaction.
    """
    return binascii.unhexlify(txid)[::-1]

# ---------------------------------------------------------------------------
# (3) Create a Raw Bitcoin Transaction (Testnet)
# ---------------------------------------------------------------------------

def create_raw_transaction(
    priv_wif,
    prev_txid,    # hex string of the UTXO
    prev_vout,    # int (output index)
    prev_value,   # satoshis in that UTXO
    destination_address,   # for "change"
    message,      # string for OP_RETURN
    nettype="test"
):
    """
    Build a raw transaction (1 input, 2 outputs):
      - OP_RETURN with `message`
      - Change output back to `destination_address`
    """

    # Convert WIF to raw privkey
    privkey_bytes = wif_to_privkey(priv_wif)

    # Public key (compressed)
    pubkey_bytes = privkey_to_pubkey(privkey_bytes, compressed=True)

    # Simple scriptPubKey for P2PKH is OP_DUP OP_HASH160 <pubKeyHash> OP_EQUALVERIFY OP_CHECKSIG
    pubkey_hash = ripemd160_sha256(pubkey_bytes)
    
    # Estimate a small fee (just a demonstration).
    # We'll do something naive: we have prev_value total, we'll spend 1000 sat for fees.
    fee = 1000
    change_value = prev_value - fee

    if change_value <= 0:
        raise ValueError("Not enough funds after fee")

    # Build the transaction in raw form
    # Version (4 bytes, little-endian)
    version = b'\x02\x00\x00\x00'  # version 2
    # Input count (VarInt)
    in_count = b'\x01'
    # Out count (VarInt) = 2 (one for OP_RETURN, one for change)
    out_count = b'\x02'
    # Locktime (4 bytes)
    locktime = b'\x00\x00\x00\x00'

    # INPUT:
    #  - Previous TxID (little-endian)
    #  - Previous Vout (4 bytes, little-endian)
    #  - ScriptSig length (varint -> 0 for now, we’ll fill with scriptSig after signing)
    #  - Sequence (4 bytes, e.g. 0xffffffff)
    prev_txid_le = little_endian_hex(prev_txid)
    prev_vout_le = prev_vout.to_bytes(4, 'little')
    sequence = b'\xff\xff\xff\xff'

    # OUTPUT 1: OP_RETURN
    #  - Value (8 bytes, little-endian) = 0 satoshis
    #  - ScriptPubKey = OP_RETURN <message in hex>
    op_return_prefix = b'\x6a'  # OP_RETURN
    msg_hex = message.encode("utf-8")  # raw bytes
    push_len = len(msg_hex)
    # scriptPubKey = OP_RETURN (1 byte) + pushdata length (1 byte) + actual data
    op_return_script = op_return_prefix + push_len.to_bytes(1, 'little') + msg_hex
    op_return_script_len = len(op_return_script)
    value_opreturn = (0).to_bytes(8, 'little')
    op_return_len = op_return_script_len.to_bytes(1, 'little')  # varint (assuming < 0xFD)
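
    # Note (added): a single-byte direct push only covers payloads up to 75 bytes;
    # longer messages would need OP_PUSHDATA1, and default relay policy treats
    # OP_RETURN outputs carrying more than 80 bytes of data as non-standard.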

    # OUTPUT 2: Change to our address
    # For Testnet P2PKH, version byte is 0x6f, but we’ll reconstruct from pubkey_hash
    # We'll do a standard P2PKH script:
    #   OP_DUP OP_HASH160 <pubKeyHash> OP_EQUALVERIFY OP_CHECKSIG
    #   which is: 76 a9 14 <20-byte-script> 88 ac
    p2pkh_prefix = b'\x76\xa9\x14'
    p2pkh_suffix = b'\x88\xac'
    script_pubkey_p2pkh = p2pkh_prefix + pubkey_hash + p2pkh_suffix
    script_pubkey_len = len(script_pubkey_p2pkh).to_bytes(1, 'little')
    value_change = change_value.to_bytes(8, 'little')

    # Put it all together (unsigned for now).
    raw_tx_unsigned = (
        version
        + in_count
        + prev_txid_le
        + prev_vout_le
        + b'\x00'  # scriptSig length placeholder (0 for unsigned)
        + sequence
        + out_count
        + value_opreturn + op_return_len + op_return_script
        + value_change + script_pubkey_len + script_pubkey_p2pkh
        + locktime
    )

    # We need the sighash for signing:
    # SIGHASH_ALL = 0x01
    sighash_all = b'\x01\x00\x00\x00'
    
    # Construct "transaction + scriptPubKey of the input + SIGHASH_ALL"
    # For P2PKH, we put the redeem script = standard scriptPubKey of that input’s address
    # That script is: OP_DUP OP_HASH160 <pubKeyHash> OP_EQUALVERIFY OP_CHECKSIG
    redeem_script = p2pkh_prefix + pubkey_hash + p2pkh_suffix
    redeem_script_len = len(redeem_script).to_bytes(1, 'little')

    # Rebuild input section with redeem script for the single input
    raw_tx_for_sig = (
        version
        + in_count
        + prev_txid_le
        + prev_vout_le
        + redeem_script_len + redeem_script
        + sequence
        + out_count
        + value_opreturn + op_return_len + op_return_script
        + value_change + script_pubkey_len + script_pubkey_p2pkh
        + locktime
        + sighash_all
    )

    # Double SHA-256
    h = hash256(raw_tx_for_sig)
    # Sign
    signature = sign_transaction(h, privkey_bytes)
    # Append SIGHASH type 0x01
    signature_plus_hashtype = signature + b'\x01'

    # Final scriptSig = <sig> <pubkey>
    script_sig = (
        len(signature_plus_hashtype).to_bytes(1, 'little') + signature_plus_hashtype
        + len(pubkey_bytes).to_bytes(1, 'little') + pubkey_bytes
    )
    script_sig_len = len(script_sig).to_bytes(1, 'little')

    # Now rebuild the final signed transaction:
    raw_tx_final = (
        version
        + in_count
        + prev_txid_le
        + prev_vout_le
        + script_sig_len + script_sig
        + sequence
        + out_count
        + value_opreturn + op_return_len + op_return_script
        + value_change + script_pubkey_len + script_pubkey_p2pkh
        + locktime
    )

    return binascii.hexlify(raw_tx_final).decode('utf-8')
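
# Offline check (illustrative; requires a local Bitcoin Core node):
#   bitcoin-cli -testnet decoderawtransaction <raw_tx_hex>
# lets you inspect the decoded inputs and outputs before broadcasting.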

# ---------------------------------------------------------------------------
# (4) Broadcast via BlockCypher (No requests library)
# ---------------------------------------------------------------------------

def broadcast_tx(hex_tx, blockcypher_token):
    """
    Broadcast a raw transaction hex to BlockCypher using urllib.
    """
    url = "https://api.blockcypher.com/v1/btc/test3/txs/push"
    data = {
        "tx": hex_tx,
        "token": blockcypher_token
    }
    data_bytes = json.dumps(data).encode("utf-8")

    req = urllib.request.Request(
        url,
        data=data_bytes,
        headers={"Content-Type": "application/json"}
    )

    try:
        with urllib.request.urlopen(req) as resp:
            body = resp.read().decode("utf-8")
            js = json.loads(body)
            print("Broadcast success!")
            print("Tx Hash:", js.get("tx", {}).get("hash"))
    except urllib.error.HTTPError as e:
        print("HTTP Error:", e.code)
        err_body = e.read().decode("utf-8")
        print("Error response:", err_body)
    except urllib.error.URLError as e:
        print("URL Error:", e.reason)

# ---------------------------------------------------------------------------
# (5) Example Usage (Main)
# ---------------------------------------------------------------------------

def main():
    # -- You must fill these in manually --

    # 1) Your Testnet WIF private key
    PRIV_WIF = "cNbVaR... (Testnet WIF) ..."  

    # 2) The TXID and output index (vout) you control with the above private key.
    #    This must have enough satoshis to cover your outputs + fee.
    PREV_TXID = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
    PREV_VOUT = 0
    PREV_VALUE = 20000  # satoshis in that UTXO

    # 3) OP_RETURN message
    MESSAGE = "Hello, Craig. Leave me alone."

    # 4) BlockCypher token
    BLOCKCYPHER_TOKEN = "8bd4fa2488614e509a677103b88b95fc"

    # 5) Since we’re sending change back to ourselves, we’ll just
    #    reuse the same private key’s address. But in a real scenario,
    #    you’d derive it from the public key. For demonstration,
    #    we assume you’re controlling that same P2PKH output.
    #    (We do not do an address-derivation snippet here.)
    DESTINATION_ADDRESS = "YourTestnetAddressHere"

    print("Creating Raw Transaction...")
    raw_tx_hex = create_raw_transaction(
        priv_wif=PRIV_WIF,
        prev_txid=PREV_TXID,
        prev_vout=PREV_VOUT,
        prev_value=PREV_VALUE,
        destination_address=DESTINATION_ADDRESS,
        message=MESSAGE,
        nettype="test",
    )

    print("Raw Transaction Hex:", raw_tx_hex)

    print("\nBroadcasting...")
    broadcast_tx(raw_tx_hex, BLOCKCYPHER_TOKEN)

if __name__ == "__main__":
    main()
#!/bin/bash
# Master Script for Dockerized Runner Images and Transaction Handling
# With Clang configurations, Cosmos SDK, and Bitcoin integration.

set -euo pipefail

# -----------------------------------------------------
# Configuration Variables
# -----------------------------------------------------

UBUNTU_IMAGE_NAME="runner-images-ubuntu-24.04"
WINDOWS_IMAGE_NAME="runner-images-windows-2025"
CONTAINER_NAME="runner-images-container"
UBUNTU_DOCKERFILE_PATH="./Dockerfile.ubuntu"
WINDOWS_DOCKERFILE_PATH="./Dockerfile.windows"
CONTEXT_DIR="."
WORKSPACE_DIR="$(pwd)"
LOG_FILE="runner-images-build.log"

# JSON File Paths
CHAIN_INFO_JSON="chain_info_mainnets.json"
IBC_INFO_JSON="ibc_info.json"
ASSET_LIST_JSON="asset_list_mainnets.json"
COSMWASM_MSGS_JSON="cosmwasm_json_msgs.json"
OSMOSIS_MSGS_JSON="osmosis_json_msgs.json"

# Docker Run Arguments
RUN_ARGS="-it --rm --mount type=bind,source=${WORKSPACE_DIR},target=/workspace --network none"

# -----------------------------------------------------
# Helper Functions
# -----------------------------------------------------

# Cleanup Function
cleanup() {
    echo "[$(date +'%Y-%m-%d %H:%M:%S')] Cleaning up any existing container..."
    docker rm -f "${CONTAINER_NAME}" 2>/dev/null || true
}

# Build Image Function
build_image() {
    local image_name="$1"
    local dockerfile_path="$2"
    echo "[$(date +'%Y-%m-%d %H:%M:%S')] Building Docker image: ${image_name}..."
    docker build -t "${image_name}" -f "${dockerfile_path}" "${CONTEXT_DIR}" | tee -a "${LOG_FILE}"
}

# Run Container Function
run_container() {
    local image_name="$1"
    echo "[$(date +'%Y-%m-%d %H:%M:%S')] Running Docker container for image: ${image_name}..."
    docker run ${RUN_ARGS} --name "${CONTAINER_NAME}" "${image_name}"
}

# Validate JSON Configurations
validate_json_files() {
    echo "[$(date +'%Y-%m-%d %H:%M:%S')] Validating JSON configurations..."
    for file in "$CHAIN_INFO_JSON" "$IBC_INFO_JSON" "$ASSET_LIST_JSON" "$COSMWASM_MSGS_JSON" "$OSMOSIS_MSGS_JSON"; do
        if [[ ! -f "$file" ]]; then
            echo "[$(date +'%Y-%m-%d %H:%M:%S')] ERROR: $file not found."
            exit 1
        fi
        jq empty "$file" >/dev/null 2>&1 || {
            echo "[$(date +'%Y-%m-%d %H:%M:%S')] ERROR: $file is not valid JSON."
            exit 1
        }
        echo "[$(date +'%Y-%m-%d %H:%M:%S')] $file is valid."
    done
}

# Cosmos SDK Transaction Handler
cosmos_transaction_handler() {
    echo "[$(date +'%Y-%m-%d %H:%M:%S')] Cosmos SDK Transaction Handler started..."
    # Here, integrate the Cosmos SDK transaction logic from the Python script
    python3 cosmos_sdk_transaction.py
}

# Bitcoin Transaction Handler
bitcoin_transaction_handler() {
    echo "[$(date +'%Y-%m-%d %H:%M:%S')] Bitcoin Transaction Handler started..."
    # Here, integrate the Bitcoin transaction logic from the Python script
    python3 bitcoin_transaction.py
}

# -----------------------------------------------------
# Main Execution Workflow
# -----------------------------------------------------

trap cleanup EXIT

echo "[$(date +'%Y-%m-%d %H:%M:%S')] Starting the unified script..."

# Validate JSON configurations
validate_json_files

# Clean up any previous runs
cleanup

# Build and Run Ubuntu Docker Image
build_image "${UBUNTU_IMAGE_NAME}" "${UBUNTU_DOCKERFILE_PATH}"
run_container "${UBUNTU_IMAGE_NAME}"

# Build and Run Windows Docker Image
build_image "${WINDOWS_IMAGE_NAME}" "${WINDOWS_DOCKERFILE_PATH}"
run_container "${WINDOWS_IMAGE_NAME}"

# Handle Cosmos SDK Transactions
cosmos_transaction_handler

# Handle Bitcoin Transactions
bitcoin_transaction_handler

echo "[$(date +'%Y-%m-%d %H:%M:%S')] Unified script execution completed."
#!/usr/bin/env python3
"""
Master Class Script: Cosmos SDK and Bitcoin Transaction Handling
With a nod to "Hello, Craig. Leave me alone, you Satoshi imposter!"
"""

import json
import requests
from hashlib import sha256
import bech32
import binascii

# -----------------------------------------------------
# Common Utilities
# -----------------------------------------------------

def sha256_double(data):
    """Double SHA-256 hashing utility."""
    return sha256(sha256(data).digest()).digest()

def little_endian_hex(txid):
    """Flip byte order for transaction ID."""
    return binascii.unhexlify(txid)[::-1]

# -----------------------------------------------------
# Cosmos SDK Utilities
# -----------------------------------------------------

def sign_cosmos_tx(unsigned_tx, privkey_hex):
    """Sign Cosmos SDK transaction using ECDSA and secp256k1."""
    import ecdsa

    privkey_bytes = bytes.fromhex(privkey_hex)
    signing_key = ecdsa.SigningKey.from_string(privkey_bytes, curve=ecdsa.SECP256k1)
    hash_ = sha256(unsigned_tx.encode()).digest()
    signature = signing_key.sign_digest(hash_, sigencode=ecdsa.util.sigencode_der)
    return signature.hex()

def create_cosmos_tx(sender, recipient, amount, denom, memo, chain_id, account_number, sequence):
    """Create Cosmos transaction in JSON format."""
    return {
        "body": {
            "messages": [
                {
                    "@type": "/cosmos.bank.v1beta1.MsgSend",
                    "from_address": sender,
                    "to_address": recipient,
                    "amount": [{"denom": denom, "amount": str(amount)}],
                }
            ],
            "memo": memo,
            "timeout_height": "0",
            "extension_options": [],
            "non_critical_extension_options": [],
        },
        "auth_info": {
            "signer_infos": [
                {
                    "public_key": {
                        "@type": "/cosmos.crypto.secp256k1.PubKey",
                        "key": "",  # Fill in public key later
                    },
                    "mode_info": {"single": {"mode": "SIGN_MODE_DIRECT"}},
                    "sequence": str(sequence),
                }
            ],
            "fee": {
                "amount": [{"denom": denom, "amount": "500"}],  # Example fee
                "gas_limit": "200000",
            },
        },
        "signatures": [""],  # Fill in after signing
    }

def broadcast_cosmos_tx(tx, node_url):
    """Broadcast Cosmos transaction via REST API."""
    broadcast_url = f"{node_url}/cosmos/tx/v1beta1/txs"
    data = {"tx_bytes": tx, "mode": "BROADCAST_MODE_BLOCK"}
    response = requests.post(broadcast_url, json=data)
    if response.status_code == 200:
        print("Broadcast Success:", response.json())
    else:
        print("Broadcast Failed:", response.text)

# -----------------------------------------------------
# Bitcoin Utilities
# -----------------------------------------------------

def create_bitcoin_raw_tx(priv_wif, prev_txid, prev_vout, prev_value, destination_address, message, nettype="test"):
    """Create Bitcoin raw transaction with OP_RETURN."""
    # Placeholder implementation: the full construction is shown in the
    # standalone Bitcoin script above; here we return a dummy hex string
    # so the simulation flow can run end to end.
    return "bitcoin_raw_transaction_hex"

def broadcast_bitcoin_tx(hex_tx, blockcypher_token):
    """Broadcast Bitcoin transaction using BlockCypher."""
    url = "https://api.blockcypher.com/v1/btc/test3/txs/push"
    data = {"tx": hex_tx, "token": blockcypher_token}
    response = requests.post(url, json=data)
    if response.status_code == 200:
        print("Broadcast Success:", response.json())
    else:
        print("Broadcast Failed:", response.text)

# -----------------------------------------------------
# Simulation Test Cases
# -----------------------------------------------------

def cosmos_simulation_test():
    """Simulate a Cosmos SDK transaction."""
    sender = "cosmos1youraddresshere"
    recipient = "cosmos1recipientaddress"
    privkey_hex = "your_private_key_in_hex"
    denom = "uatom"
    amount = 100000
    memo = "Hello, Craig. Leave me alone, you Satoshi imposter!"
    node_url = "https://rpc.cosmos.network"
    chain_id = "cosmoshub-4"
    account_number = 12345
    sequence = 0

    print("Creating Cosmos Transaction...")
    unsigned_tx = create_cosmos_tx(sender, recipient, amount, denom, memo, chain_id, account_number, sequence)
    print("Unsigned TX:", json.dumps(unsigned_tx, indent=2))

    print("Signing Cosmos Transaction...")
    signature = sign_cosmos_tx(json.dumps(unsigned_tx), privkey_hex)
    unsigned_tx["signatures"][0] = signature
    print("Signed TX:", json.dumps(unsigned_tx, indent=2))

    print("Broadcasting Cosmos Transaction...")
    broadcast_cosmos_tx(unsigned_tx, node_url)

def bitcoin_simulation_test():
    """Simulate a Bitcoin transaction."""
    priv_wif = "your_testnet_wif"
    prev_txid = "your_previous_txid"
    prev_vout = 0
    prev_value = 20000
    destination_address = "your_destination_address"
    message = "Hello, Craig. Leave me alone, you Satoshi imposter!"
    blockcypher_token = "your_blockcypher_token"

    print("Creating Bitcoin Raw Transaction...")
    raw_tx_hex = create_bitcoin_raw_tx(priv_wif, prev_txid, prev_vout, prev_value, destination_address, message)
    print("Raw TX Hex:", raw_tx_hex)

    print("Broadcasting Bitcoin Transaction...")
    broadcast_bitcoin_tx(raw_tx_hex, blockcypher_token)

def main():
    """Run combined tests for Cosmos SDK and Bitcoin."""
    print("Running Cosmos Simulation...")
    cosmos_simulation_test()

    print("\nRunning Bitcoin Simulation...")
    bitcoin_simulation_test()

if __name__ == "__main__":
    main()

#!/usr/bin/env python3
"""
Cosmos SDK Transaction Creator & Broadcaster
With a nod to "Hello, Craig. Leave me alone, you Satoshi imposter!"
"""

import json
import requests
from hashlib import sha256
import bech32

# -----------------------------------------------------
# Cosmos SDK Utilities
# -----------------------------------------------------

def sign_tx(unsigned_tx, privkey_hex):
    """
    Sign the transaction using SHA-256 hashing and ECDSA with Cosmos secp256k1 keys.
    """
    import ecdsa

    privkey_bytes = bytes.fromhex(privkey_hex)
    signing_key = ecdsa.SigningKey.from_string(privkey_bytes, curve=ecdsa.SECP256k1)
    hash_ = sha256(unsigned_tx.encode()).digest()
    signature = signing_key.sign_digest(hash_, sigencode=ecdsa.util.sigencode_der)
    return signature.hex()

def create_tx(sender, recipient, amount, denom, memo, chain_id, account_number, sequence):
    """
    Create a Cosmos transaction in JSON format.
    """
    unsigned_tx = {
        "body": {
            "messages": [
                {
                    "@type": "/cosmos.bank.v1beta1.MsgSend",
                    "from_address": sender,
                    "to_address": recipient,
                    "amount": [{"denom": denom, "amount": str(amount)}],
                }
            ],
            "memo": memo,
            "timeout_height": "0",
            "extension_options": [],
            "non_critical_extension_options": []
        },
        "auth_info": {
            "signer_infos": [
                {
                    "public_key": {
                        "@type": "/cosmos.crypto.secp256k1.PubKey",
                        "key": "",  # Fill in public key later
                    },
                    "mode_info": {"single": {"mode": "SIGN_MODE_DIRECT"}},
                    "sequence": str(sequence),
                }
            ],
            "fee": {
                "amount": [{"denom": denom, "amount": "500"}],  # Example fee
                "gas_limit": "200000",
            },
        },
        "signatures": [""],  # Fill in after signing
    }

    return unsigned_tx

def broadcast_tx(tx, node_url):
    """
    Broadcast the signed transaction using REST endpoint.
    """
    broadcast_url = f"{node_url}/cosmos/tx/v1beta1/txs"
    data = {"tx_bytes": tx, "mode": "BROADCAST_MODE_BLOCK"}
    response = requests.post(broadcast_url, json=data)

    if response.status_code == 200:
        print("Broadcast Success:", response.json())
    else:
        print("Broadcast Failed:", response.text)

# -----------------------------------------------------
# Main Function
# -----------------------------------------------------

def main():
    # User Inputs
    SENDER = "cosmos1youraddresshere"
    RECIPIENT = "cosmos1recipientaddress"
    PRIVKEY_HEX = "your_private_key_in_hex"
    DENOM = "uatom"  # Example: ATOM
    AMOUNT = 100000  # 100000 uatom = 0.1 ATOM
    MEMO = "Hello, Craig. Leave me alone, you Satoshi imposter!"
    NODE_URL = "https://rpc.cosmos.network"
    CHAIN_ID = "cosmoshub-4"
    ACCOUNT_NUMBER = 12345
    SEQUENCE = 0

    # Create unsigned transaction
    print("Creating Transaction...")
    unsigned_tx = create_tx(
        sender=SENDER,
        recipient=RECIPIENT,
        amount=AMOUNT,
        denom=DENOM,
        memo=MEMO,
        chain_id=CHAIN_ID,
        account_number=ACCOUNT_NUMBER,
        sequence=SEQUENCE,
    )
    print("Unsigned TX:", json.dumps(unsigned_tx, indent=2))

    # Sign the transaction
    print("Signing Transaction...")
    signature = sign_tx(json.dumps(unsigned_tx), PRIVKEY_HEX)
    unsigned_tx["signatures"][0] = signature
    print("Signed TX:", json.dumps(unsigned_tx, indent=2))

    # Broadcast the transaction
    print("Broadcasting Transaction...")
    broadcast_tx(unsigned_tx, NODE_URL)

if __name__ == "__main__":
    main()
#!/usr/bin/env python3
"""
Pure-Python demonstration of creating, signing, and broadcasting
a Bitcoin Testnet transaction with an OP_RETURN output, without
any external libraries (requests, bitcoinlib, ecdsa, etc.).
"""

import hashlib
import binascii
import json
import urllib.request
import urllib.error

# ---------------------------------------------------------------------------
# (1) Minimal Elliptic Curve (ECDSA) Implementation for secp256k1
# ---------------------------------------------------------------------------

# secp256k1 domain parameters
P  = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEFFFFFC2F
N  = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
A  = 0
B  = 7
Gx = 0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798
Gy = 0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8

def modinv(a, m):
    """Compute modular inverse of a mod m using Extended Euclidean Algorithm."""
    return pow(a, -1, m)

def point_add(x1, y1, x2, y2):
    """
    Add two points (x1, y1) and (x2, y2) on secp256k1.
    Returns (x3, y3).
    """
    if x1 is None and y1 is None:
        return x2, y2
    if x2 is None and y2 is None:
        return x1, y1

    if x1 == x2 and (y1 + y2) % P == 0:
        # P + (-P): the result is the point at infinity
        return None, None

    if x1 == x2 and y1 == y2:
        # Point doubling
        s = (3 * x1 * x1) * modinv(2 * y1, P) % P
    else:
        # Point addition
        dx = (x2 - x1) % P
        dy = (y2 - y1) % P
        s = (dy) * modinv(dx, P) % P

    x3 = (s * s - x1 - x2) % P
    y3 = (s * (x1 - x3) - y1) % P
    return x3, y3

def scalar_multiplication(k, x, y):
    """Compute k*(x, y) using the double-and-add algorithm."""
    rx, ry = None, None
    tx, ty = x, y
    while k > 0:
        if k & 1:
            rx, ry = point_add(rx, ry, tx, ty)
        tx, ty = point_add(tx, ty, tx, ty)
        k >>= 1
    return rx, ry

def privkey_to_pubkey(privkey_bytes, compressed=True):
    """Derive the public key (x, y) from a 32-byte private key."""
    priv_int = int.from_bytes(privkey_bytes, 'big')
    # Multiply generator G by priv_int
    x, y = scalar_multiplication(priv_int, Gx, Gy)
    if compressed:
        # Compressed pubkey format
        prefix = b'\x02' if (y % 2 == 0) else b'\x03'
        return prefix + x.to_bytes(32, 'big')
    else:
        # Uncompressed: 0x04 + X + Y
        return b'\x04' + x.to_bytes(32, 'big') + y.to_bytes(32, 'big')

def sign_transaction(hash32, privkey_bytes):
    """
    Produce a compact DER ECDSA signature of hash32 using privkey_bytes.
    This is a minimal implementation and may omit some edge cases.
    """
    z = int.from_bytes(hash32, 'big')
    k = deterministic_k(z, privkey_bytes)
    r, s = raw_ecdsa_sign(z, privkey_bytes, k)

    # Make sure s is low (BIP 62)
    if s > (N // 2):
        s = N - s

    # Convert r, s to DER format
    return der_encode_sig(r, s)

def deterministic_k(z, privkey_bytes):
    """
    Very simplified RFC 6979 (deterministic k) generator for demonstration.
    """
    import hmac

    x = int.from_bytes(privkey_bytes, 'big')
    z = z % N
    if x > N:
        x = x - N

    # RFC6979 step: V = 0x01 32-byte, K = 0x00 32-byte
    k_bytes = b'\x00' * 32
    v_bytes = b'\x01' * 32
    priv_bytes_32 = x.to_bytes(32, 'big')
    z_bytes_32 = z.to_bytes(32, 'big')

    def hmac_sha256(key, data):
        return hmac.new(key, data, hashlib.sha256).digest()

    k_bytes = hmac_sha256(k_bytes, v_bytes + b'\x00' + priv_bytes_32 + z_bytes_32)
    v_bytes = hmac_sha256(k_bytes, v_bytes)

    k_bytes = hmac_sha256(k_bytes, v_bytes + b'\x01' + priv_bytes_32 + z_bytes_32)
    v_bytes = hmac_sha256(k_bytes, v_bytes)

    while True:
        v_bytes = hmac_sha256(k_bytes, v_bytes)
        t = int.from_bytes(v_bytes, 'big')
        if 1 <= t < N:
            return t
        k_bytes = hmac_sha256(k_bytes, v_bytes + b'\x00')
        v_bytes = hmac_sha256(k_bytes, v_bytes)

def raw_ecdsa_sign(z, privkey_bytes, k):
    """Sign with ECDSA using random nonce k (already determined)."""
    priv_int = int.from_bytes(privkey_bytes, 'big')
    # R = (k * G).x mod n
    x_r, _ = scalar_multiplication(k, Gx, Gy)
    r = x_r % N
    if r == 0:
        raise Exception("Invalid r=0 in ECDSA signature")

    # s = k^-1 (z + r*priv) mod n
    s = (modinv(k, N) * (z + r*priv_int)) % N
    if s == 0:
        raise Exception("Invalid s=0 in ECDSA signature")
    return (r, s)

def der_encode_sig(r, s):
    """DER-encode the r, s ECDSA values."""
    def encode_int(x):
        xb = x.to_bytes((x.bit_length() + 7) // 8, 'big')
        # If high bit is set, prefix with 0x00
        if xb[0] & 0x80:
            xb = b'\x00' + xb
        return xb

    rb = encode_int(r)
    sb = encode_int(s)
    # 0x02 <len> <rb> 0x02 <len> <sb>
    sequence = b'\x02' + bytes([len(rb)]) + rb + b'\x02' + bytes([len(sb)]) + sb
    # 0x30 <len> <sequence>
    return b'\x30' + bytes([len(sequence)]) + sequence

# ---------------------------------------------------------------------------
# (2) Basic Bitcoin Utility Functions
# ---------------------------------------------------------------------------

def base58_check_decode(s):
    """Decode a base58-check string to raw bytes (payload)."""
    alphabet = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"
    num = 0
    for char in s:
        num = num * 58 + alphabet.index(char)
    # Recover the full byte string; leading '1' characters encode leading zero bytes.
    n_pad = len(s) - len(s.lstrip('1'))
    combined = b'\x00' * n_pad + num.to_bytes((num.bit_length() + 7) // 8, byteorder='big')
    chk = combined[-4:]
    payload = combined[:-4]
    # Verify checksum
    hash_ = hashlib.sha256(hashlib.sha256(payload).digest()).digest()[:4]
    if hash_ != chk:
        raise ValueError("Invalid base58 checksum")
    return payload[1:]  # drop version byte

def wif_to_privkey(wif_str):
    """
    Convert a WIF private key (Testnet or Mainnet) into 32-byte raw.
    Assumes no compression byte or handles it if present.
    """
    raw = base58_check_decode(wif_str)
    # base58_check_decode() has already stripped the version byte (0xEF for
    # Testnet WIF, 0x80 for Mainnet), so `raw` is either 32 bytes (uncompressed
    # WIF) or 33 bytes (32-byte key followed by a 0x01 compression flag).
    if len(raw) == 33 and raw[-1] == 0x01:
        # Compressed-pubkey WIF
        return raw[:-1]  # strip the trailing 0x01 compression flag
    # Uncompressed
    return raw

def hash256(b):
    """SHA-256 twice."""
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

def ripemd160_sha256(b):
    """RIPEMD160(SHA-256(b))."""
    h = hashlib.new('ripemd160')
    h.update(hashlib.sha256(b).digest())
    return h.digest()

def little_endian_hex(txid):
    """
    Convert a TXID hex string (displayed big-endian) into the little-endian
    raw bytes used inside the serialized transaction.
    """
    return binascii.unhexlify(txid)[::-1]

# ---------------------------------------------------------------------------
# (3) Create a Raw Bitcoin Transaction (Testnet)
# ---------------------------------------------------------------------------

def create_raw_transaction(
    priv_wif,
    prev_txid,    # hex string of the UTXO
    prev_vout,    # int (output index)
    prev_value,   # satoshis in that UTXO
    destination_address,   # for "change"
    message,      # string for OP_RETURN
    nettype="test"
):
    """
    Build a raw transaction (1 input, 2 outputs):
      - OP_RETURN with message
      - Change output back to destination_address
    """

    # Convert WIF to raw privkey
    privkey_bytes = wif_to_privkey(priv_wif)

    # Public key (compressed)
    pubkey_bytes = privkey_to_pubkey(privkey_bytes, compressed=True)

    # Simple scriptPubKey for P2PKH is OP_DUP OP_HASH160 <pubKeyHash> OP_EQUALVERIFY OP_CHECKSIG
    pubkey_hash = ripemd160_sha256(pubkey_bytes)
    
    # Estimate a small fee (just a demonstration).
    # We'll do something naive: we have prev_value total, we'll spend 1000 sat for fees.
    fee = 1000
    change_value = prev_value - fee

    if change_value <= 0:
        raise ValueError("Not enough funds after fee")

    # Build the transaction in raw form
    # Version (4 bytes, little-endian)
    version = b'\x02\x00\x00\x00'  # version 2
    # Input count (VarInt)
    in_count = b'\x01'
    # Out count (VarInt) = 2 (one for OP_RETURN, one for change)
    out_count = b'\x02'
    # Locktime (4 bytes)
    locktime = b'\x00\x00\x00\x00'

    # INPUT:
    #  - Previous TxID (little-endian)
    #  - Previous Vout (4 bytes, little-endian)
    #  - ScriptSig length (varint -> 0 for now, we’ll fill with scriptSig after signing)
    #  - Sequence (4 bytes, e.g. 0xffffffff)
    prev_txid_le = little_endian_hex(prev_txid)
    prev_vout_le = prev_vout.to_bytes(4, 'little')
    sequence = b'\xff\xff\xff\xff'

    # OUTPUT 1: OP_RETURN
    #  - Value (8 bytes, little-endian) = 0 satoshis
    #  - ScriptPubKey = OP_RETURN <message in hex>
    op_return_prefix = b'\x6a'  # OP_RETURN
    msg_hex = message.encode("utf-8")  # raw bytes
    push_len = len(msg_hex)
    # scriptPubKey = OP_RETURN (1 byte) + pushdata length (1 byte) + actual data
    op_return_script = op_return_prefix + push_len.to_bytes(1, 'little') + msg_hex
    op_return_script_len = len(op_return_script)
    value_opreturn = (0).to_bytes(8, 'little')
    op_return_len = op_return_script_len.to_bytes(1, 'little')  # varint (assuming < 0xFD)

    # OUTPUT 2: Change to our address
    # For Testnet P2PKH, version byte is 0x6f, but we’ll reconstruct from pubkey_hash
    # We'll do a standard P2PKH script:
    #   OP_DUP OP_HASH160 <pubKeyHash> OP_EQUALVERIFY OP_CHECKSIG
    #   which is: 76 a9 14 <20-byte-script> 88 ac
    p2pkh_prefix = b'\x76\xa9\x14'
    p2pkh_suffix = b'\x88\xac'
    script_pubkey_p2pkh = p2pkh_prefix + pubkey_hash + p2pkh_suffix
    script_pubkey_len = len(script_pubkey_p2pkh).to_bytes(1, 'little')
    value_change = change_value.to_bytes(8, 'little')

    # Put it all together (unsigned for now).
    raw_tx_unsigned = (
        version
        + in_count
        + prev_txid_le
        + prev_vout_le
        + b'\x00'  # scriptSig length placeholder (0 for unsigned)
        + sequence
        + out_count
        + value_opreturn + op_return_len + op_return_script
        + value_change + script_pubkey_len + script_pubkey_p2pkh
        + locktime
    )

    # We need the sighash for signing:
    # SIGHASH_ALL = 0x01
    sighash_all = b'\x01\x00\x00\x00'
    
    # Construct "transaction + scriptPubKey of the input + SIGHASH_ALL"
    # For P2PKH, we put the redeem script = standard scriptPubKey of that input’s address
    # That script is: OP_DUP OP_HASH160 <pubKeyHash> OP_EQUALVERIFY OP_CHECKSIG
    redeem_script = p2pkh_prefix + pubkey_hash + p2pkh_suffix
    redeem_script_len = len(redeem_script).to_bytes(1, 'little')

    # Rebuild input section with redeem script for the single input
    raw_tx_for_sig = (
        version
        + in_count
        + prev_txid_le
        + prev_vout_le
        + redeem_script_len + redeem_script
        + sequence
        + out_count
        + value_opreturn + op_return_len + op_return_script
        + value_change + script_pubkey_len + script_pubkey_p2pkh
        + locktime
        + sighash_all
    )

    # Double SHA-256
    h = hash256(raw_tx_for_sig)
    # Sign
    signature = sign_transaction(h, privkey_bytes)
    # Append SIGHASH type 0x01
    signature_plus_hashtype = signature + b'\x01'

    # Final scriptSig = <sig> <pubkey>
    script_sig = (
        len(signature_plus_hashtype).to_bytes(1, 'little') + signature_plus_hashtype
        + len(pubkey_bytes).to_bytes(1, 'little') + pubkey_bytes
    )
    script_sig_len = len(script_sig).to_bytes(1, 'little')

    # Now rebuild the final signed transaction:
    raw_tx_final = (
        version
        + in_count
        + prev_txid_le
        + prev_vout_le
        + script_sig_len + script_sig
        + sequence
        + out_count
        + value_opreturn + op_return_len + op_return_script
        + value_change + script_pubkey_len + script_pubkey_p2pkh
        + locktime
    )

    return binascii.hexlify(raw_tx_final).decode('utf-8')

# ---------------------------------------------------------------------------
# (4) Broadcast via BlockCypher (No requests library)
# ---------------------------------------------------------------------------

def broadcast_tx(hex_tx, blockcypher_token):
    """
    Broadcast a raw transaction hex to BlockCypher using urllib.
    """
    url = "https://api.blockcypher.com/v1/btc/test3/txs/push"
    data = {
        "tx": hex_tx,
        "token": blockcypher_token
    }
    data_bytes = json.dumps(data).encode("utf-8")

    req = urllib.request.Request(
        url,
        data=data_bytes,
        headers={"Content-Type": "application/json"}
    )

    try:
        with urllib.request.urlopen(req) as resp:
            body = resp.read().decode("utf-8")
            js = json.loads(body)
            print("Broadcast success!")
            print("Tx Hash:", js.get("tx", {}).get("hash"))
    except urllib.error.HTTPError as e:
        print("HTTP Error:", e.code)
        err_body = e.read().decode("utf-8")
        print("Error response:", err_body)
    except urllib.error.URLError as e:
        print("URL Error:", e.reason)

# ---------------------------------------------------------------------------
# (5) Example Usage (Main)
# ---------------------------------------------------------------------------

def main():
    # -- You must fill these in manually --

    # 1) Your Testnet WIF private key
    PRIV_WIF = "cNbVaR... (Testnet WIF) ..."  

    # 2) The TXID and output index (vout) you control with the above private key.
    #    This must have enough satoshis to cover your outputs + fee.
    PREV_TXID = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
    PREV_VOUT = 0
    PREV_VALUE = 20000  # satoshis in that UTXO

    # 3) OP_RETURN message
    MESSAGE = "Hello, Craig. Leave me alone."

    # 4) BlockCypher token
    BLOCKCYPHER_TOKEN = "8bd4fa2488614e509a677103b88b95fc"

    # 5) Since we’re sending change back to ourselves, we’ll just
    #    reuse the same private key’s address. But in a real scenario,
    #    you’d derive it from the public key. For demonstration,
    #    we assume you’re controlling that same P2PKH output.
    #    (We do not do an address-derivation snippet here.)
    DESTINATION_ADDRESS = "YourTestnetAddressHere"

    print("Creating Raw Transaction...")
    raw_tx_hex = create_raw_transaction(
        priv_wif=PRIV_WIF,
        prev_txid=PREV_TXID,
        prev_vout=PREV_VOUT,
        prev_value=PREV_VALUE,
        destination_address=DESTINATION_ADDRESS,
        message=MESSAGE,
        nettype="test",
    )

    print("Raw Transaction Hex:", raw_tx_hex)

    print("\nBroadcasting...")
    broadcast_tx(raw_tx_hex, BLOCKCYPHER_TOKEN)

if __name__ == "__main__":
    main()

Enhanced Script
Here’s a refactored version of your script tailored for the Cosmos SDK:

#!/usr/bin/env python3
"""
Cosmos SDK Transaction Creator & Broadcaster
With a nod to "Hello, Craig. Leave me alone, you Satoshi imposter!"
"""

import json
import requests
from hashlib import sha256
import bech32

# -----------------------------------------------------
# Cosmos SDK Utilities
# -----------------------------------------------------

def sign_tx(unsigned_tx, privkey_hex):
    """
    Sign the transaction using SHA-256 hashing and ECDSA with Cosmos secp256k1 keys.
    """
    import ecdsa

    privkey_bytes = bytes.fromhex(privkey_hex)
    signing_key = ecdsa.SigningKey.from_string(privkey_bytes, curve=ecdsa.SECP256k1)
    hash_ = sha256(unsigned_tx.encode()).digest()
    signature = signing_key.sign_digest(hash_, sigencode=ecdsa.util.sigencode_der)
    return signature.hex()

def create_tx(sender, recipient, amount, denom, memo, chain_id, account_number, sequence):
    """
    Create a Cosmos transaction in JSON format.
    """
    unsigned_tx = {
        "body": {
            "messages": [
                {
                    "@type": "/cosmos.bank.v1beta1.MsgSend",
                    "from_address": sender,
                    "to_address": recipient,
                    "amount": [{"denom": denom, "amount": str(amount)}],
                }
            ],
            "memo": memo,
            "timeout_height": "0",
            "extension_options": [],
            "non_critical_extension_options": []
        },
        "auth_info": {
            "signer_infos": [
                {
                    "public_key": {
                        "@type": "/cosmos.crypto.secp256k1.PubKey",
                        "key": "",  # Fill in public key later
                    },
                    "mode_info": {"single": {"mode": "SIGN_MODE_DIRECT"}},
                    "sequence": str(sequence),
                }
            ],
            "fee": {
                "amount": [{"denom": denom, "amount": "500"}],  # Example fee
                "gas_limit": "200000",
            },
        },
        "signatures": [""],  # Fill in after signing
    }

    return unsigned_tx

def broadcast_tx(tx, node_url):
    """
    Broadcast the signed transaction using REST endpoint.
    """
    broadcast_url = f"{node_url}/cosmos/tx/v1beta1/txs"
    data = {"tx_bytes": tx, "mode": "BROADCAST_MODE_BLOCK"}
    response = requests.post(broadcast_url, json=data)

    if response.status_code == 200:
        print("Broadcast Success:", response.json())
    else:
        print("Broadcast Failed:", response.text)

# -----------------------------------------------------
# Main Function
# -----------------------------------------------------

def main():
    # User Inputs
    SENDER = "cosmos1youraddresshere"
    RECIPIENT = "cosmos1recipientaddress"
    PRIVKEY_HEX = "your_private_key_in_hex"
    DENOM = "uatom"  # Example: ATOM
    AMOUNT = 100000  # 100000 uatom = 0.1 ATOM
    MEMO = "Hello, Craig. Leave me alone, you Satoshi imposter!"
    NODE_URL = "https://rpc.cosmos.network"
    CHAIN_ID = "cosmoshub-4"
    ACCOUNT_NUMBER = 12345
    SEQUENCE = 0

    # Create unsigned transaction
    print("Creating Transaction...")
    unsigned_tx = create_tx(
        sender=SENDER,
        recipient=RECIPIENT,
        amount=AMOUNT,
        denom=DENOM,
        memo=MEMO,
        chain_id=CHAIN_ID,
        account_number=ACCOUNT_NUMBER,
        sequence=SEQUENCE,
    )
    print("Unsigned TX:", json.dumps(unsigned_tx, indent=2))

    # Sign the transaction
    print("Signing Transaction...")
    signature = sign_tx(json.dumps(unsigned_tx), PRIVKEY_HEX)
    unsigned_tx["signatures"][0] = signature
    print("Signed TX:", json.dumps(unsigned_tx, indent=2))

    # Broadcast the transaction
    print("Broadcasting Transaction...")
    broadcast_tx(unsigned_tx, NODE_URL)

if __name__ == "__main__":
    main()
# Use base images for C++ development
FROM mcr.microsoft.com/devcontainers/cpp:1-ubuntu-24.04 AS ubuntu-base
FROM mcr.microsoft.com/dotnet/framework/sdk:4.8-windowsservercore-ltsc2022 AS windows-base

# Ubuntu Environment Setup
FROM ubuntu-base AS ubuntu-setup
ARG REINSTALL_CMAKE_VERSION_FROM_SOURCE="none"
COPY ./reinstall-cmake.sh /tmp/
RUN if [ "${REINSTALL_CMAKE_VERSION_FROM_SOURCE}" != "none" ]; then \
        chmod +x /tmp/reinstall-cmake.sh && /tmp/reinstall-cmake.sh ${REINSTALL_CMAKE_VERSION_FROM_SOURCE}; \
    fi \
    && rm -f /tmp/reinstall-cmake.sh \
    && apt-get update && export DEBIAN_FRONTEND=noninteractive \
    && apt-get -y install --no-install-recommends \
       python3-pip \
       nodejs \
       npm \
       openjdk-17-jdk \
       gdb \
       valgrind \
       lsof \
       git \
       clang-18 \
       libstdc++-12-dev \
       glibc-source \
    && apt-get clean && rm -rf /var/lib/apt/lists/*

# Python setup
RUN python3 -m pip install --upgrade pip

# Node.js setup
RUN npm install -g yarn

# Install vcpkg if not already present
ENV VCPKG_INSTALLATION_ROOT=/vcpkg
RUN git clone https://github.com/microsoft/vcpkg.git $VCPKG_INSTALLATION_ROOT \
    && cd $VCPKG_INSTALLATION_ROOT \
    && ./bootstrap-vcpkg.sh

# Copy project files into the container
COPY . /workspace
WORKDIR /workspace
CMD ["bash"]

# Windows Environment Setup
FROM windows-base AS windows-setup
SHELL ["powershell", "-Command", "$ErrorActionPreference = 'Stop'; $ProgressPreference = 'SilentlyContinue';"]
RUN iex ((New-Object System.Net.WebClient).DownloadString('https://chocolatey.org/install.ps1')); \
    choco install -y \
Comment on lines +48 to +49
@coderabbitai coderabbitai bot 2 hours ago
⚠️ Potential issue

Use secure download method for Chocolatey installation

Using iex with downloaded content from the internet is potentially unsafe. Use the official installation method.

-RUN iex ((New-Object System.Net.WebClient).DownloadString('https://chocolatey.org/install.ps1')); \
+RUN Set-ExecutionPolicy Bypass -Scope Process -Force; \
+    [System.Net.ServicePointManager]::SecurityProtocol = [System.Net.ServicePointManager]::SecurityProtocol -bor 3072; \
+    iex ((New-Object System.Net.WebClient).DownloadString('https://community.chocolatey.org/install.ps1')); \
    msys2 \
    cmake \
    clang \
    python \
    nodejs \
    git \
    jdk17 \
    visualstudio2022buildtools --package-parameters "--add Microsoft.VisualStudio.Workload.VCTools --includeRecommended"

# Setup environment variables
ENV PATH="${PATH};C:\msys64\usr\bin;C:\Program Files\Git\cmd"

# Install vcpkg for Windows
RUN git clone https://github.com/microsoft/vcpkg.git C:\vcpkg \
    && cd C:\vcpkg \
    && .\bootstrap-vcpkg.bat

# Copy project files into the container
COPY . C:\workspace
WORKDIR C:\workspace
CMD ["powershell"]

Comment on lines +1 to +71
@coderabbitai coderabbitai bot 2 hours ago
⚠️ Potential issue

Separate Dockerfile content into dedicated files

The file incorrectly combines Dockerfile content with shell script content. This should be split into separate files:

  • Dockerfile.ubuntu
  • Dockerfile.windows

Move the Dockerfile content (lines 1-71) to their respective files and keep only the shell script content in this file.

## Perform a double-check second run using this script:

#!/bin/bash

# Script to build and run Runner Images for Ubuntu 24.04 and Windows Server 2025 debugging
# with Clang setup and Cosmos SDK integration.

# Variables
UBUNTU_IMAGE_NAME="runner-images-ubuntu-24.04"
WINDOWS_IMAGE_NAME="runner-images-windows-2025"
CONTAINER_NAME="runner-images-container"
UBUNTU_DOCKERFILE_PATH="./Dockerfile.ubuntu" # Adjust if Dockerfile for Ubuntu is in a different location
WINDOWS_DOCKERFILE_PATH="./Dockerfile.windows" # Adjust if Dockerfile for Windows is in a different location
CONTEXT_DIR="." # Adjust if the context is a different directory
WORKSPACE_DIR="$(pwd)" # Current directory as the workspace
UBUNTU_CLANGFILE_PATH="clangfile.ubuntu.json"
WINDOWS_CLANGFILE_PATH="clangfile.windows.json"
LOG_FILE="runner-images-build.log"

# JSON File Paths
CHAIN_INFO_JSON="chain_info_mainnets.json"
IBC_INFO_JSON="ibc_info.json"
ASSET_LIST_JSON="asset_list_mainnets.json"
COSMWASM_MSGS_JSON="cosmwasm_json_msgs.json"
OSMOSIS_MSGS_JSON="osmosis_json_msgs.json"

# Functions

# Cleanup Function
cleanup() {
    echo "[$(date +'%Y-%m-%d %H:%M:%S')] Cleaning up any existing container with the same name..."
    if docker rm -f ${CONTAINER_NAME} 2>/dev/null; then
        echo "[$(date +'%Y-%m-%d %H:%M:%S')] Container ${CONTAINER_NAME} successfully removed."
    else
        echo "[$(date +'%Y-%m-%d %H:%M:%S')] No container named ${CONTAINER_NAME} found or removal failed."
    fi
}

# Build Image Function
build_image() {
    local image_name="$1"
    local dockerfile_path="$2"
    local clangfile_path="$3"
    echo "[$(date +'%Y-%m-%d %H:%M:%S')] Building Docker image: ${image_name}..."
    if docker build -t ${image_name} -f ${dockerfile_path} --build-arg CLANGFILE=${clangfile_path} ${CONTEXT_DIR} | tee -a ${LOG_FILE}; then
        echo "[$(date +'%Y-%m-%d %H:%M:%S')] Docker image ${image_name} built successfully."
    else
        echo "[$(date +'%Y-%m-%d %H:%M:%S')] ERROR: Docker image build for ${image_name} failed. Check ${LOG_FILE} for details."
        exit 1
    fi
}

# Run Container Function
run_container() {
    local image_name="$1"
    echo "[$(date +'%Y-%m-%d %H:%M:%S')] Running Docker container: ${CONTAINER_NAME} for ${image_name}..."
    docker run -it --rm \
        --name ${CONTAINER_NAME} \
        --mount type=bind,source=${WORKSPACE_DIR},target=/workspace \
        --network none \
        ${image_name}
    if [ $? -eq 0 ]; then
        echo "[$(date +'%Y-%m-%d %H:%M:%S')] Container ${CONTAINER_NAME} for ${image_name} ran successfully."
    else
        echo "[$(date +'%Y-%m-%d %H:%M:%S')] ERROR: Failed to run container ${CONTAINER_NAME} for ${image_name}."
        exit 1
    fi
}

# Validate JSON Configurations
validate_json_files() {
    echo "[$(date +'%Y-%m-%d %H:%M:%S')] Validating JSON configurations..."
    for file in $CHAIN_INFO_JSON $IBC_INFO_JSON $ASSET_LIST_JSON $COSMWASM_MSGS_JSON $OSMOSIS_MSGS_JSON; do
        if jq empty $file >/dev/null 2>&1; then
            echo "[$(date +'%Y-%m-%d %H:%M:%S')] $file is valid."
        else
            echo "[$(date +'%Y-%m-%d %H:%M:%S')] ERROR: $file is not valid JSON."
            exit 1
        fi
    done
}

# Main Execution Workflow
echo "[$(date +'%Y-%m-%d %H:%M:%S')] Starting Runner Image Setup for Ubuntu 24.04 and Windows Server 2025 with Clang configurations..."

# Validate JSON configurations
validate_json_files

# Clean up any previous runs
cleanup

# Build the Ubuntu Docker image with Clang configuration
build_image ${UBUNTU_IMAGE_NAME} ${UBUNTU_DOCKERFILE_PATH} ${UBUNTU_CLANGFILE_PATH}

# Run the Ubuntu container
run_container ${UBUNTU_IMAGE_NAME}

# Build the Windows Docker image with Clang configuration
build_image ${WINDOWS_IMAGE_NAME} ${WINDOWS_DOCKERFILE_PATH} ${WINDOWS_CLANGFILE_PATH}

# Run the Windows container
run_container ${WINDOWS_IMAGE_NAME}

echo "[$(date +'%Y-%m-%d %H:%M:%S')] Runner Image Setup for both Ubuntu 24.04 and Windows Server 2025 with Clang configurations completed."
Comment on lines +74 to +175
Contributor
@coderabbitai coderabbitai bot 2 hours ago
🛠️ Refactor suggestion

Consolidate duplicate functionality

This script duplicates functionality from Ubunutu_Windows_Image_SDK_Updater.sh. Consider:

  • Creating a shared library for common functions
  • Having one script import from the other
Create a new file common.sh:

#!/bin/bash
# common.sh - Shared functions for Docker container management

cleanup() {
    # ... (existing cleanup function)
}

build_image() {
    # ... (existing build_image function)
}

run_container() {
    # ... (existing run_container function)
}

validate_json_files() {
    # ... (existing validate_json_files function)
}
Then source it in both scripts:

+#!/bin/bash
+source ./common.sh
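
For illustration, a minimal sketch of what that shared common.sh could look like, assuming both scripts keep defining CONTAINER_NAME, CONTEXT_DIR, and LOG_FILE before sourcing it; passing the JSON paths as arguments (rather than reading the globals) is one possible refactor, not something either script does today:

#!/bin/bash
# common.sh - shared Docker helpers sourced by both updater scripts (sketch only).
# Assumes the caller defines CONTAINER_NAME, CONTEXT_DIR, and LOG_FILE.

log() {
    echo "[$(date +'%Y-%m-%d %H:%M:%S')] $*" | tee -a "${LOG_FILE}"
}

cleanup() {
    log "Cleaning up any existing container named ${CONTAINER_NAME}..."
    docker rm -f "${CONTAINER_NAME}" 2>/dev/null || true
}

build_image() {
    local image_name="$1" dockerfile_path="$2" clangfile_path="$3"
    log "Building Docker image: ${image_name}..."
    docker build -t "${image_name}" -f "${dockerfile_path}" \
        --build-arg CLANGFILE="${clangfile_path}" "${CONTEXT_DIR}" | tee -a "${LOG_FILE}"
}

run_container() {
    local image_name="$1"
    log "Running container ${CONTAINER_NAME} from ${image_name}..."
    docker run -i --rm --name "${CONTAINER_NAME}" \
        --mount type=bind,source="$(pwd)",target=/workspace \
        "${image_name}"
}

validate_json_files() {
    local file
    for file in "$@"; do
        if jq empty "${file}" >/dev/null 2>&1; then
            log "${file} is valid."
        else
            log "ERROR: ${file} is not valid JSON."
            return 1
        fi
    done
}

Each top-level script would then keep only its own variable block, source this file, and call the helpers, which also centralises the quoting fixes suggested further down in this review.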
Binary file added (+31.9 KB): PMLL_Blockchain_Assembly-main.zip
117 changes: 117 additions & 0 deletions
Ubunutu_Windows_Image_SDK_Updater.sh
@@ -0,0 +1,117 @@
#!/bin/bash

# Script to build and run Runner Images for Ubuntu 24.04 and Windows Server 2025 debugging
# with Clang setup and Cosmos SDK integration.

# Variables
UBUNTU_IMAGE_NAME="runner-images-ubuntu-24.04"
WINDOWS_IMAGE_NAME="runner-images-windows-2025"
Comment on lines +7 to +8
Contributor
@coderabbitai coderabbitai bot 2 hours ago
⚠️ Potential issue

Fix typo in image name and use correct Windows Server version

"Ubunutu" is misspelled in the script name and variables
Windows Server 2025 is not yet released
-UBUNTU_IMAGE_NAME="runner-images-ubuntu-24.04"
-WINDOWS_IMAGE_NAME="runner-images-windows-2025"
+UBUNTU_IMAGE_NAME="runner-images-ubuntu-22.04"
+WINDOWS_IMAGE_NAME="runner-images-windows-2022"
CONTAINER_NAME="runner-images-container"
UBUNTU_DOCKERFILE_PATH="./Dockerfile.ubuntu" # Adjust if Dockerfile for Ubuntu is in a different location
WINDOWS_DOCKERFILE_PATH="./Dockerfile.windows" # Adjust if Dockerfile for Windows is in a different location
CONTEXT_DIR="." # Adjust if the context is a different directory
WORKSPACE_DIR="$(pwd)" # Current directory as the workspace
UBUNTU_CLANGFILE_PATH="clangfile.ubuntu.json"
WINDOWS_CLANGFILE_PATH="clangfile.windows.json"
LOG_FILE="runner-images-build.log"

# JSON File Paths
CHAIN_INFO_JSON="chain_info_mainnets.json"
IBC_INFO_JSON="ibc_info.json"
ASSET_LIST_JSON="asset_list_mainnets.json"
COSMWASM_MSGS_JSON="cosmwasm_json_msgs.json"
OSMOSIS_MSGS_JSON="osmosis_json_msgs.json"

# Ensure required tools are installed
command -v docker >/dev/null 2>&1 || { echo >&2 "docker is required but it's not installed. Aborting."; exit 1; }
command -v jq >/dev/null 2>&1 || { echo >&2 "jq is required but it's not installed. Aborting."; exit 1; }

# Ensure Docker daemon is running
if ! docker info >/dev/null 2>&1; then
    echo "Docker daemon is not running. Please start Docker and try again."
    exit 1
fi

# Cleanup Function
cleanup() {
    echo "[$(date +'%Y-%m-%d %H:%M:%S')] Cleaning up any existing container with the same name..."
    if docker rm -f ${CONTAINER_NAME} 2>/dev/null; then
        echo "[$(date +'%Y-%m-%d %H:%M:%S')] Container ${CONTAINER_NAME} successfully removed."
    else
        echo "[$(date +'%Y-%m-%d %H:%M:%S')] No container named ${CONTAINER_NAME} found or removal failed."
    fi
Comment on lines +38 to +42
Contributor
@coderabbitai coderabbitai bot 2 hours ago
⚠️ Potential issue

Quote variables in Docker commands to prevent word splitting

Unquoted variables in Docker commands can lead to issues with spaces or special characters.

-    if docker rm -f ${CONTAINER_NAME} 2>/dev/null; then
-        echo "[$(date +'%Y-%m-%d %H:%M:%S')] Container ${CONTAINER_NAME} successfully removed."
+    if docker rm -f "${CONTAINER_NAME}" 2>/dev/null; then
+        echo "[$(date +'%Y-%m-%d %H:%M:%S')] Container '${CONTAINER_NAME}' successfully removed."
Committable suggestion skipped: line range outside the PR's diff.

}

# Build Image Function with retry logic
build_image() {
    local image_name="$1"
    local dockerfile_path="$2"
    local clangfile_path="$3"
    echo "[$(date +'%Y-%m-%d %H:%M:%S')] Building Docker image: ${image_name}..."
    for i in {1..3}; do
        if docker build -t ${image_name} -f ${dockerfile_path} --build-arg CLANGFILE=${clangfile_path} ${CONTEXT_DIR} | tee -a ${LOG_FILE}; then
            echo "[$(date +'%Y-%m-%d %H:%M:%S')] Docker image ${image_name} built successfully."
            return 0
        else
            echo "[$(date +'%Y-%m-%d %H:%M:%S')] ERROR: Docker image build for ${image_name} failed. Retry $i/3."
            sleep 5
        fi
    done
    echo "[$(date +'%Y-%m-%d %H:%M:%S')] ERROR: Docker image build for ${image_name} failed after 3 attempts. Check ${LOG_FILE} for details."
    exit 1
}

# Run Container Function
run_container() {
    local image_name="$1"
    echo "[$(date +'%Y-%m-%d %H:%M:%S')] Running Docker container: ${CONTAINER_NAME} for ${image_name}..."
    if docker run -it --rm \
        --name ${CONTAINER_NAME} \
        --mount type=bind,source=${WORKSPACE_DIR},target=/workspace \
        --network none \
        ${image_name}; then
        echo "[$(date +'%Y-%m-%d %H:%M:%S')] Container ${CONTAINER_NAME} for ${image_name} ran successfully."
    else
        echo "[$(date +'%Y-%m-%d %H:%M:%S')] ERROR: Failed to run container ${CONTAINER_NAME} for ${image_name}."
        exit 1
    fi
Comment on lines +68 to +77
Contributor
@coderabbitai coderabbitai bot 2 hours ago
🛠️ Refactor suggestion

Reconsider interactive mode and network isolation

The container is run with:

  • Interactive mode (-it), which requires a TTY and might not be suitable for CI/CD
  • Network disabled (--network none), which might prevent necessary operations
-    if docker run -it --rm \
+    if docker run -i --rm \
         --name ${CONTAINER_NAME} \
         --mount type=bind,source=${WORKSPACE_DIR},target=/workspace \
-        --network none \
         ${image_name}; then
}

# Validate JSON Configurations
validate_json_files() {
    echo "[$(date +'%Y-%m-%d %H:%M:%S')] Validating JSON configurations..."
    for file in $CHAIN_INFO_JSON $IBC_INFO_JSON $ASSET_LIST_JSON $COSMWASM_MSGS_JSON $OSMOSIS_MSGS_JSON; do
        if jq empty $file >/dev/null 2>&1; then
            echo "[$(date +'%Y-%m-%d %H:%M:%S')] $file is valid."
        else
            echo "[$(date +'%Y-%m-%d %H:%M:%S')] ERROR: $file is not valid JSON."
            exit 1
        fi
    done
}

# Trap exit signals to clean up
trap cleanup EXIT

# Main Execution Workflow
echo "[$(date +'%Y-%m-%d %H:%M:%S')] Starting Runner Image Setup for Ubuntu 24.04 and Windows Server 2025 with Clang configurations..." | tee -a ${LOG_FILE}

# Validate JSON configurations
validate_json_files

# Clean up any previous runs
cleanup

# Build the Ubuntu Docker image with Clang configuration
build_image ${UBUNTU_IMAGE_NAME} ${UBUNTU_DOCKERFILE_PATH} ${UBUNTU_CLANGFILE_PATH}

# Run the Ubuntu container
run_container ${UBUNTU_IMAGE_NAME}

# Build the Windows Docker image with Clang configuration
build_image ${WINDOWS_IMAGE_NAME} ${WINDOWS_DOCKERFILE_PATH} ${WINDOWS_CLANGFILE_PATH}

# Run the Windows container
run_container ${WINDOWS_IMAGE_NAME}

echo "[$(date +'%Y-%m-%d %H:%M:%S')] Runner Image Setup for both Ubuntu 24.04 and Windows Server 2025 with Clang configurations completed." | tee -a ${LOG_FILE}
Binary file added (+924 KB): ZBar-master.zip
341 changes: 341 additions & 0 deletions
dockerclangfile.sh
@@ -0,0 +1,341 @@
# Use base images for C++ development
Contributor
@coderabbitai coderabbitai bot 2 hours ago
⚠️ Potential issue

Add shebang directive and file description

The script lacks a proper shebang directive and file documentation.

Add the following at the beginning of the file:

+#!/usr/bin/env bash
+
+# dockerclangfile.sh - Script to build and run Docker images for Ubuntu and Windows
+# with Clang setup and Cosmos SDK integration
+
+set -euo pipefail
FROM mcr.microsoft.com/devcontainers/cpp:1-ubuntu-24.04 AS ubuntu-base
FROM mcr.microsoft.com/dotnet/framework/sdk:4.8-windowsservercore-ltsc2022 AS windows-base

# Ubuntu Environment Setup
FROM ubuntu-base AS ubuntu-setup
ARG REINSTALL_CMAKE_VERSION_FROM_SOURCE="none"
COPY ./reinstall-cmake.sh /tmp/
RUN if [ "${REINSTALL_CMAKE_VERSION_FROM_SOURCE}" != "none" ]; then \
        chmod +x /tmp/reinstall-cmake.sh && /tmp/reinstall-cmake.sh ${REINSTALL_CMAKE_VERSION_FROM_SOURCE}; \
    fi \
    && rm -f /tmp/reinstall-cmake.sh \
    && apt-get update && export DEBIAN_FRONTEND=noninteractive \
    && apt-get -y install --no-install-recommends \
       python3-pip \
       nodejs \
       npm \
       openjdk-17-jdk \
       gdb \
       valgrind \
       lsof \
       git \
       clang-18 \
       libstdc++-12-dev \
       glibc-source \
    && apt-get clean && rm -rf /var/lib/apt/lists/*

# Python setup
RUN python3 -m pip install --upgrade pip

# Node.js setup
RUN npm install -g yarn

# Install vcpkg if not already present
ENV VCPKG_INSTALLATION_ROOT=/vcpkg
RUN git clone https://github.com/microsoft/vcpkg.git $VCPKG_INSTALLATION_ROOT \
    && cd $VCPKG_INSTALLATION_ROOT \
    && ./bootstrap-vcpkg.sh

# Copy project files into the container
COPY . /workspace
W…
Contributor

coderabbitai bot commented Jan 3, 2025

📝 Walkthrough

Walkthrough

A comprehensive set of scripts and workflows has been introduced to enhance development and continuous integration processes. The changes include a new GitHub Actions workflow for cross-platform Go project testing, a Python script for Bitcoin transaction handling, and multiple shell scripts for Docker-based development environments targeting Ubuntu and Windows platforms. These scripts focus on setting up development environments, automating build processes, and facilitating SDK updates with robust error handling and configuration validation.

Changes

File Change Summary
.github/workflows/Cosmo.Mac.Ubuntu.Window.go.yml New GitHub Actions workflow for Go project testing across Ubuntu, macOS, and Windows, including jobs for building and testing Go projects and .NET projects.
Broadcasting.sh New Python script for Bitcoin Testnet transaction creation, signing, and broadcasting with ECDSA implementation.
Cosmos_sdk_updater.sh New shell script for Docker container setup for C++ development on Ubuntu and Windows.
Ubunutu_Windows_Image_SDK_Updater.sh New shell script for building and running Docker images with debugging and Cosmos SDK integration.
dockerclangfile.sh New Docker setup script for C++ development environments with Clang on Ubuntu and Windows, including error handling and JSON validation.
Build_&_Go_SDK_Proto_Run.YML New GitHub Actions workflow for building and pushing a SimApp and SDK Proto Builder, with support for multiple architectures.
mergify.yml Updated with new pull request rules for backporting changes and automerging to the main branch.
.github/workflows/c-cpp.yml New GitHub Actions workflow for automating the CI process for C/C++ projects.

Sequence Diagram

sequenceDiagram
    participant GA as GitHub Actions
    participant Build as Build & Test Job
    participant Go as Go Environment
    participant SDK as Cosmos SDK

    GA->>Build: Trigger on push/PR
    Build->>Go: Setup Go environment
    Go->>SDK: Fetch latest release version
    SDK-->>Go: Return version
    Go->>Go: Update modules
    Go->>Build: Build project
    Build->>Build: Run tests
    Build-->>GA: Report status
Loading

The sequence diagram illustrates the GitHub Actions workflow for the Go project, showing the process of setting up the environment, fetching the Cosmos SDK, updating modules, building the project, and running tests.

Possibly related PRs

Suggested reviewers

  • julienrbrt
  • sontrinh16


Contributor

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 14

🔭 Outside diff range comments (2)
dockerclangfile.sh (2)

Line range hint 2009-2010: Fix network isolation implementation

The network isolation comment contains a syntax error that breaks the Docker run command.

Fix the network isolation implementation:

-        --network none \  # Ensures no network access for isolation
+        --network none \
🧰 Tools
🪛 Shellcheck (0.10.0)

[error] 1-1: Tips depend on target shell and yours is unknown. Add a shebang or a 'shell' directive.

(SC2148)


[error] 9-9: Parsing stopped here. Is this keyword correctly matched up?

(SC1089)


Line range hint 447-449: Update image versions to use released versions

The script uses unreleased versions of Ubuntu and Windows Server.

Use currently available versions:

-UBUNTU_IMAGE_NAME="runner-images-ubuntu-24.04"
-WINDOWS_IMAGE_NAME="runner-images-windows-2025"
+UBUNTU_IMAGE_NAME="runner-images-ubuntu-22.04"
+WINDOWS_IMAGE_NAME="runner-images-windows-2022"
🧰 Tools
🪛 Shellcheck (0.10.0)

[error] 1-1: Tips depend on target shell and yours is unknown. Add a shebang or a 'shell' directive.

(SC2148)


[error] 9-9: Parsing stopped here. Is this keyword correctly matched up?

(SC1089)

🧹 Nitpick comments (2)
.github/workflows/Cosmo.Mac.Ubuntu.Window.go.yml (1)

21-21: Consider using Go 1.21 for better performance and features

Go 1.20 is not the latest stable version. Consider upgrading to Go 1.21 for improved performance, security updates, and new features.

-        go-version: '1.20'  # You can specify a version range or use 'stable' if you want the latest stable version
+        go-version: '1.21'  # Using latest stable version for improved performance and security
Broadcasting.sh (1)

2693-2720: Enhance error handling in broadcast_tx function

The error handling could be more robust and informative.

Consider these improvements:

 def broadcast_tx(hex_tx, blockcypher_token):
+    if not hex_tx or not blockcypher_token:
+        raise ValueError("hex_tx and blockcypher_token are required")
+
     url = "https://api.blockcypher.com/v1/btc/test3/txs/push"
     data = {
         "tx": hex_tx,
         "token": blockcypher_token
     }
     data_bytes = json.dumps(data).encode("utf-8")
+    
+    timeout = 30  # Add timeout
     req = urllib.request.Request(
         url,
         data=data_bytes,
         headers={"Content-Type": "application/json"}
     )
     try:
-        with urllib.request.urlopen(req) as resp:
+        with urllib.request.urlopen(req, timeout=timeout) as resp:
             body = resp.read().decode("utf-8")
             js = json.loads(body)
             print("Broadcast success!")
             print("Tx Hash:", js.get("tx", {}).get("hash"))
+            return js
     except urllib.error.HTTPError as e:
         print("HTTP Error:", e.code)
         err_body = e.read().decode("utf-8")
         print("Error response:", err_body)
+        raise
     except urllib.error.URLError as e:
         print("URL Error:", e.reason)
+        raise
+    except TimeoutError:
+        print("Request timed out")
+        raise
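
As a usage illustration only, here is a hypothetical call site for the hardened function, assuming the suggested behaviour above (returning the parsed JSON on success and raising on failure) and a token supplied via a BLOCKCYPHER_TOKEN environment variable; none of these names come from the original Broadcasting.sh:

import os
import sys

def main():
    # Hypothetical caller; broadcast_tx is assumed to be defined earlier in the
    # same script with the signature suggested in the review above.
    token = os.environ.get("BLOCKCYPHER_TOKEN")
    if not token:
        sys.exit("BLOCKCYPHER_TOKEN is not set")
    hex_tx = "0100...00"  # placeholder raw transaction hex, not a real transaction
    try:
        result = broadcast_tx(hex_tx, token)
    except Exception as exc:
        sys.exit(f"broadcast failed: {exc}")
    print("broadcast tx hash:", result.get("tx", {}).get("hash"))

if __name__ == "__main__":
    main()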
📜 Review details

Configuration used: .coderabbit.yml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 07d5168 and b936116.

⛔ Files ignored due to path filters (3)
  • PMLL_Blockchain_Assembly-main.zip is excluded by !**/*.zip
  • ZBar-master.zip is excluded by !**/*.zip
  • pmll_blockchain-main (8).zip is excluded by !**/*.zip
📒 Files selected for processing (5)
  • .github/workflows/Cosmo.Mac.Ubuntu.Window.go.yml (1 hunks)
  • Broadcasting.sh (1 hunks)
  • Cosmos_sdk_updater.sh (1 hunks)
  • Ubunutu_Windows_Image_SDK_Updater.sh (1 hunks)
  • dockerclangfile.sh (1 hunks)
🧰 Additional context used
🪛 actionlint (1.7.4)
.github/workflows/Cosmo.Mac.Ubuntu.Window.go.yml

26-26: shellcheck reported issue in this script: SC2086:info:4:42: Double quote to prevent globbing and word splitting

(shellcheck)

🪛 Shellcheck (0.10.0)
Cosmos_sdk_updater.sh

[error] 1-1: Tips depend on target shell and yours is unknown. Add a shebang or a 'shell' directive.

(SC2148)


[error] 9-9: Parsing stopped here. Is this keyword correctly matched up?

(SC1089)

Broadcasting.sh

[error] 1-1: ShellCheck only supports sh/bash/dash/ksh/'busybox sh' scripts. Sorry!

(SC1071)

dockerclangfile.sh

[error] 1-1: Tips depend on target shell and yours is unknown. Add a shebang or a 'shell' directive.

(SC2148)


[error] 9-9: Parsing stopped here. Is this keyword correctly matched up?

(SC1089)

🪛 Gitleaks (8.21.2)
Broadcasting.sh

410-410: Detected a Generic API Key, potentially exposing access to various services and sensitive operations.

(generic-api-key)


1244-1244: Detected a Generic API Key, potentially exposing access to various services and sensitive operations.

(generic-api-key)


2735-2735: Detected a Generic API Key, potentially exposing access to various services and sensitive operations.

(generic-api-key)

🔇 Additional comments (2)
Ubunutu_Windows_Image_SDK_Updater.sh (2)

71-71: ⚠️ Potential issue

Add volume mount permissions for Windows containers

The Windows container mount requires explicit permissions for proper operation.

-        ${image_name}; then
+        --security-opt "credentialspec=file://default.json" \
+        ${image_name}; then

Likely invalid or redundant comment.


83-84: 🛠️ Refactor suggestion

Validate Dockerfile paths

Add validation for Dockerfile existence before attempting to build.

 UBUNTU_DOCKERFILE_PATH="./Dockerfile.ubuntu" # Adjust if Dockerfile for Ubuntu is in a different location
 WINDOWS_DOCKERFILE_PATH="./Dockerfile.windows" # Adjust if Dockerfile for Windows is in a different location
+
+# Validate Dockerfile paths
+for dockerfile in "$UBUNTU_DOCKERFILE_PATH" "$WINDOWS_DOCKERFILE_PATH"; do
+    if [ ! -f "$dockerfile" ]; then
+        echo "ERROR: Dockerfile not found at $dockerfile"
+        exit 1
+    fi
+done

Likely invalid or redundant comment.

@bearycool11 bearycool11 changed the title Bitcoin is now onboarded and able to broadcast with cosmos-sdk-broadcast.sh feat: Bitcoin is now onboarded and able to broadcast with cosmos-sdk-broadcast.sh Jan 4, 2025
@bearycool11 bearycool11 changed the title feat: Bitcoin is now onboarded and able to broadcast with cosmos-sdk-broadcast.sh Feat & Fix: Bitcoin is now onboarded and able to broadcast with cosmos-sdk-broadcast.sh Jan 4, 2025
Author

@bearycool11 bearycool11 left a comment

name: Go

on:
  push:
    branches: [ "main" ]
  pull_request:
    branches: [ "main" ]

permissions:
  contents: read
  pull-requests: write

jobs:
  build-and-test:
    strategy:
      matrix:
        os: [ubuntu-24.04, macos-latest, windows-latest]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v4

      - name: Set up Go
        uses: actions/[email protected]
        with:
          go-version: '1.20'  # You can specify a version range or use 'stable' if you want the latest stable version
          cache: true
          cache-dependency-path: go.sum

      - name: Fetch Latest Cosmos SDK Version
        run: |
          # Fetch the latest release tag from GitHub
          LATEST_VERSION=$(curl -s "https://api.github.com/repos/cosmos/cosmos-sdk/releases/latest" | grep '"tag_name":' | sed -E 's/.*"([^"]+)".*/\1/')
          echo "Latest Cosmos SDK version: $LATEST_VERSION"
          echo "COSMOS_VERSION=$LATEST_VERSION" >> $GITHUB_ENV

      - name: Update Go Modules
        run: |
          # Update go.mod with the latest Cosmos SDK version
          go get github.com/cosmos/cosmos-sdk@${{ env.COSMOS_VERSION }}
          go mod tidy

      - name: Build
        run: go build -v ./...

      - name: Test
        run: go test -v ./...

      - name: Check PR Title for Semantic Compliance
        uses: amannn/[email protected]
        with:
          githubBaseUrl: https://api.github.com

      - name: Add Sticky Pull Request Comment
        uses: marocchino/sticky-pull-request-comment@v2
        with:
          header: pr-title-lint-error
          message: |
            Hey there and thank you for opening this pull request! 👋🏼

            We require pull request titles to follow the [Conventional Commits specification](https://www.conventionalcommits.org/en/v1.0.0/) and it looks like your proposed title needs to be adjusted.

            Details:

            ```
            No release type found in pull request title "{{ github.event.pull_request.title }}". Add a prefix to indicate what kind of release this pull request corresponds to. For reference, see https://www.conventionalcommits.org/

            Available types:
             - feat: A new feature
             - fix: A bug fix
             - docs: Documentation only changes
             - style: Changes that do not affect the meaning of the code (white-space, formatting, missing semi-colons, etc)
             - refactor: A code change that neither fixes a bug nor adds a feature
             - perf: A code change that improves performance
             - test: Adding missing tests or correcting existing tests
             - build: Changes that affect the build system or external dependencies (example scopes: gulp, broccoli, npm)
             - ci: Changes to our CI configuration files and scripts (example scopes: Travis, Circle, BrowserStack, SauceLabs)
             - chore: Other changes that don't modify src or test files
             - revert: Reverts a previous commit
            ```

Author

@bearycool11 bearycool11 left a comment

yes but also combine it with this

name: Go

on:
  push:
    branches: [ "main" ]
  pull_request:
    branches: [ "main" ]

permissions:
  contents: read
  pull-requests: write

jobs:
  build-and-test:
    strategy:
      matrix:
        os: [ubuntu-24.04, macos-latest, windows-latest]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v4

      - name: Set up Go
        uses: actions/[email protected]
        with:
          go-version: '1.20'  # You can specify a version range or use 'stable' if you want the latest stable version
          cache: true
          cache-dependency-path: go.sum

      - name: Fetch Latest Cosmos SDK Version
        run: |
          # Fetch the latest release tag from GitHub
          LATEST_VERSION=$(curl -s "https://api.github.com/repos/cosmos/cosmos-sdk/releases/latest" | grep '"tag_name":' | sed -E 's/.*"([^"]+)".*/\1/')
          echo "Latest Cosmos SDK version: $LATEST_VERSION"
          echo "COSMOS_VERSION=$LATEST_VERSION" >> $GITHUB_ENV

      - name: Update Go Modules
        run: |
          # Update go.mod with the latest Cosmos SDK version
          go get github.com/cosmos/cosmos-sdk@${{ env.COSMOS_VERSION }}
          go mod tidy

      - name: Build
        run: go build -v ./...

      - name: Test
        run: go test -v ./...

      - name: Check PR Title for Semantic Compliance
        uses: amannn/[email protected]
        with:
          githubBaseUrl: https://api.github.com

      - name: Add Sticky Pull Request Comment
        uses: marocchino/sticky-pull-request-comment@v2
        with:
          header: pr-title-lint-error
          message: |
            Hey there and thank you for opening this pull request! 👋🏼

            We require pull request titles to follow the [Conventional Commits specification](https://www.conventionalcommits.org/en/v1.0.0/) and it looks like your proposed title needs to be adjusted.

            Details:

            ```
            No release type found in pull request title "{{ github.event.pull_request.title }}". Add a prefix to indicate what kind of release this pull request corresponds to. For reference, see https://www.conventionalcommits.org/

            Available types:
             - feat: A new feature
             - fix: A bug fix
             - docs: Documentation only changes
             - style: Changes that do not affect the meaning of the code (white-space, formatting, missing semi-colons, etc)
             - refactor: A code change that neither fixes a bug nor adds a feature
             - perf: A code change that improves performance
             - test: Adding missing tests or correcting existing tests
             - build: Changes that affect the build system or external dependencies (example scopes: gulp, broccoli, npm)
             - ci: Changes to our CI configuration files and scripts (example scopes: Travis, Circle, BrowserStack, SauceLabs)
             - chore: Other changes that don't modify src or test files
             - revert: Reverts a previous commit
            ```

  windows-image-setup:
    runs-on: ubuntu-22.04
    steps:
      - name: Define Windows Image Matrix
        run: |
          WINDOWS_IMAGE_NAME="runner-images-windows-2022"
          WINDOWS_IMAGE_NAME="runner-images-windows-2023"
          WINDOWS_IMAGE_NAME="runner-images-windows-2024"

      - name: Setup BlockCypher Token Securely
        env:
          BLOCKCYPHER_TOKEN: ${{ secrets.BLOCKCYPHER_TOKEN }}
        run: |
          echo "BLOCKCYPHER_TOKEN is set securely."

      - name: Sync Blockchain Cypher
        run: |
          echo "Synchronizing Blockchain Cypher with token."
          curl -X POST -H "Authorization: Bearer $BLOCKCYPHER_TOKEN" \
            -d '{"action":"sync","target":"CosmosSDK"}' \
            https://api.blockcypher.com/v1/blockchains/pulse

Author

@bearycool11 bearycool11 left a comment

name: Go

on:
  push:
    branches: [ "main" ]
  pull_request:
    branches: [ "main" ]

permissions:
  contents: read
  pull-requests: write

jobs:
  build-and-test:
    strategy:
      matrix:
        os: [ubuntu-24.04, macos-latest, windows-latest]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v4

      - name: Set up Go
        uses: actions/[email protected]
        with:
          go-version: '1.20'  # You can specify a version range or use 'stable' if you want the latest stable version
          cache: true
          cache-dependency-path: go.sum

      - name: Fetch Latest Cosmos SDK Version
        run: |
          # Fetch the latest release tag from GitHub
          LATEST_VERSION=$(curl -s "https://api.github.com/repos/cosmos/cosmos-sdk/releases/latest" | grep '"tag_name":' | sed -E 's/.*"([^"]+)".*/\1/')
          echo "Latest Cosmos SDK version: $LATEST_VERSION"
          echo "COSMOS_VERSION=$LATEST_VERSION" >> $GITHUB_ENV

      - name: Update Go Modules
        run: |
          # Update go.mod with the latest Cosmos SDK version
          go get github.com/cosmos/cosmos-sdk@${{ env.COSMOS_VERSION }}
          go mod tidy

      - name: Build
        run: go build -v ./...

      - name: Test
        run: go test -v ./...

      - name: Check PR Title for Semantic Compliance
        uses: amannn/[email protected]
        with:
          githubBaseUrl: https://api.github.com

      - name: Add Sticky Pull Request Comment
        uses: marocchino/sticky-pull-request-comment@v2
        with:
          header: pr-title-lint-error
          message: |
            Hey there and thank you for opening this pull request! 👋🏼

            We require pull request titles to follow the [Conventional Commits specification](https://www.conventionalcommits.org/en/v1.0.0/) and it looks like your proposed title needs to be adjusted.

            Details:

            ```
            No release type found in pull request title "{{ github.event.pull_request.title }}". Add a prefix to indicate what kind of release this pull request corresponds to. For reference, see https://www.conventionalcommits.org/

            Available types:
             - feat: A new feature
             - fix: A bug fix
             - docs: Documentation only changes
             - style: Changes that do not affect the meaning of the code (white-space, formatting, missing semi-colons, etc)
             - refactor: A code change that neither fixes a bug nor adds a feature
             - perf: A code change that improves performance
             - test: Adding missing tests or correcting existing tests
             - build: Changes that affect the build system or external dependencies (example scopes: gulp, broccoli, npm)
             - ci: Changes to our CI configuration files and scripts (example scopes: Travis, Circle, BrowserStack, SauceLabs)
             - chore: Other changes that don't modify src or test files
             - revert: Reverts a previous commit
            ```

  windows-image-setup:
    runs-on: ubuntu-22.04
    steps:
      - name: Define Windows Image Matrix
        run: |
          WINDOWS_IMAGE_NAME="runner-images-windows-2022"
          WINDOWS_IMAGE_NAME="runner-images-windows-2023"
          WINDOWS_IMAGE_NAME="runner-images-windows-2024"

      - name: Setup BlockCypher Token Securely
        env:
          BLOCKCYPHER_TOKEN: ${{ secrets.BLOCKCYPHER_TOKEN }}
        run: |
          echo "BLOCKCYPHER_TOKEN is set securely."

      - name: Sync Blockchain Cypher
        run: |
          echo "Synchronizing Blockchain Cypher with token."
          curl -X POST -H "Authorization: Bearer $BLOCKCYPHER_TOKEN" \
            -d '{"action":"sync","target":"CosmosSDK"}' \
            https://api.blockcypher.com/v1/blockchains/pulse

SDK.yaml and Coderabbit AI/bot rabbit suggestions added
Contributor

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 3

♻️ Duplicate comments (2)
.github/workflows/Cosmo.Mac.Ubuntu.Window.go.yml (2)

32-32: ⚠️ Potential issue

Add error handling for curl request

The curl request to GitHub API still lacks error handling and rate limit consideration.

Apply this previously agreed-upon fix:

-        LATEST_VERSION=$(curl -s "https://api.github.com/repos/cosmos/cosmos-sdk/releases/latest" | grep '"tag_name":' | sed -E 's/.*"([^"]+)".*/\1/')
+        response=$(curl -sL -w "%{http_code}" -H "Accept: application/vnd.github.v3+json" "https://api.github.com/repos/cosmos/cosmos-sdk/releases/latest" -o response.json)
+        if [ "$response" != "200" ]; then
+          echo "Failed to fetch latest version. HTTP status: $response"
+          exit 1
+        fi
+        LATEST_VERSION=$(jq -r .tag_name response.json)
+        if [ -z "$LATEST_VERSION" ]; then
+          echo "Failed to parse version from response"
+          exit 1
+        fi

38-40: ⚠️ Potential issue

Add version verification before update

The module update process should still verify the fetched version before applying it.

Apply this previously agreed-upon fix:

       run: |
+        # Verify semantic version format
+        if [[ ! "${{ env.COSMOS_VERSION }}" =~ ^v[0-9]+\.[0-9]+\.[0-9]+(-[a-zA-Z0-9]+)?$ ]]; then
+          echo "Invalid version format: ${{ env.COSMOS_VERSION }}"
+          exit 1
+        fi
         # Update go.mod with the latest Cosmos SDK version
         go get github.com/cosmos/cosmos-sdk@${{ env.COSMOS_VERSION }}
         go mod tidy
📜 Review details

Configuration used: .coderabbit.yml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between b936116 and bdcdd3d.

📒 Files selected for processing (1)
  • .github/workflows/Cosmo.Mac.Ubuntu.Window.go.yml (1 hunks)
🧰 Additional context used
🪛 actionlint (1.7.4)
.github/workflows/Cosmo.Mac.Ubuntu.Window.go.yml

30-30: shellcheck reported issue in this script: SC2086:info:4:42: Double quote to prevent globbing and word splitting

(shellcheck)


72-72: shellcheck reported issue in this script: SC2034:warning:4:1: WINDOWS_IMAGE_NAME appears unused. Verify use (or export if used externally)

(shellcheck)


95-95: key "name" is duplicated in workflow. previously defined at line:1,col:1

(syntax-check)


97-97: key "on" is duplicated in workflow. previously defined at line:3,col:1

(syntax-check)


103-103: key "permissions" is duplicated in workflow. previously defined at line:9,col:1

(syntax-check)


107-107: key "jobs" is duplicated in workflow. previously defined at line:13,col:1

(syntax-check)

🪛 yamllint (1.35.1)
.github/workflows/Cosmo.Mac.Ubuntu.Window.go.yml

[error] 59-59: trailing spaces

(trailing-spaces)


[error] 60-60: trailing spaces

(trailing-spaces)


[error] 62-62: trailing spaces

(trailing-spaces)


[error] 64-64: trailing spaces

(trailing-spaces)


[error] 95-95: duplication of key "name" in mapping

(key-duplicates)


[error] 97-97: duplication of key "on" in mapping

(key-duplicates)


[error] 103-103: duplication of key "permissions" in mapping

(key-duplicates)


[error] 107-107: duplication of key "jobs" in mapping

(key-duplicates)


[error] 169-169: trailing spaces

(trailing-spaces)


[error] 170-170: trailing spaces

(trailing-spaces)

@bearycool11

This comment was marked as spam.

@bearycool11 bearycool11 marked this pull request as draft January 4, 2025 01:16
@bearycool11 bearycool11 marked this pull request as ready for review January 4, 2025 01:17
Build, go, and Run the SDK Prototype and run the image updates along the SDK Prototype.
@bearycool11 bearycool11 changed the title Feat & Fix: Bitcoin is now onboarded and able to broadcast with cosmos-sdk-broadcast.sh Feat & Fix: Bitcoin Cosmos SDK Prototype is now onboarded and able to broadcast with cosmos-sdk-broadcast.sh and build, go, and run upgrades! Jan 4, 2025
Contributor

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 3

🧹 Nitpick comments (2)
Build_&_Go_SDK_Proto_Run.YML (2)

33-40: Enhance cache key reliability

The cache key should include a hash of the installation scripts to ensure cache invalidation when installation steps change.

-          key: ${{ runner.os }}-rocksdb-${{ env.ROCKSDB_VERSION }}-${{ matrix.go-arch }}
+          key: ${{ runner.os }}-rocksdb-${{ env.ROCKSDB_VERSION }}-${{ matrix.go-arch }}-${{ hashFiles('.github/scripts/install-rocksdb*.sh') }}

16-18: Add job timeout and artifact uploads

Consider adding:

  1. Timeout to prevent hung jobs from blocking the queue
  2. Artifact uploads for build outputs
jobs:
  build:
    runs-on: ubuntu-latest
+    timeout-minutes: 30
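
The diff above covers the timeout; the artifact-upload half of the suggestion might look roughly like the following, assuming the build drops its outputs under build/ (a placeholder path, not taken from the workflow) and that the job keeps the go-arch matrix referenced in the cache key above:

      - name: Upload build artifacts
        if: success()
        uses: actions/upload-artifact@v4
        with:
          name: sdk-proto-build-${{ matrix.go-arch }}
          path: build/
          retention-days: 7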
📜 Review details

Configuration used: .coderabbit.yml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between bdcdd3d and 48f115b.

📒 Files selected for processing (1)
  • Build_&_Go_SDK_Proto_Run.YML (1 hunks)
🧰 Additional context used
🪛 yamllint (1.35.1)
Build_&_Go_SDK_Proto_Run.YML

[error] 65-65: duplication of key "name" in mapping

(key-duplicates)


[error] 66-66: duplication of key "on" in mapping

(key-duplicates)


[error] 86-86: duplication of key "concurrency" in mapping

(key-duplicates)


[error] 90-90: duplication of key "jobs" in mapping

(key-duplicates)


[error] 125-125: syntax error: expected '', but found ''

(syntax)

mergify .... sounds like a weird ass Harry Potter spell like Spongify or something lol.
Contributor

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

🧹 Nitpick comments (2)
mergify.yml (2)

2-51: Consider adding additional safety checks to backport rules

While the backport rules are functionally correct, consider adding these safety conditions to prevent problematic backports:

  • Check for merge conflicts: - -conflict
  • Verify successful tests: - check-success=tests
  • Ensure clean CI: - check-success=ci
  - name: backport to v0.52.x
    conditions:
      - "label=backport/v0.52.x"
      - "base=main"
      - "merged"
+     - "-conflict"
+     - "check-success=tests"
+     - "check-success=ci"
    actions:
      backport:
        branches:
          - "v0.52.x"

2-51: Review version branch ordering

The backport rules are ordered: v0.52.x -> v0.50.x -> v0.47.x -> v0.46.x -> v0.45.x. Consider maintaining a consistent descending order for better maintainability and clarity.

Consider reordering the rules in strict descending order:

  1. v0.52.x
  2. v0.50.x
  3. v0.47.x
  4. v0.46.x
  5. v0.45.x
📜 Review details

Configuration used: .coderabbit.yml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 48f115b and 2212217.

📒 Files selected for processing (1)
  • mergify.yml (1 hunks)

Contributor

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 4

📜 Review details

Configuration used: .coderabbit.yml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 2212217 and b57c1f9.

📒 Files selected for processing (1)
  • .github/workflows/c-cpp.yml (1 hunks)
🔇 Additional comments (1)
.github/workflows/c-cpp.yml (1)

1-8: ⚠️ Potential issue

Workflow configuration needs alignment with project requirements

This appears to be a standard C/C++ workflow, but according to the PR objectives, this is a Bitcoin Cosmos SDK project. The workflow should be tailored accordingly.

Let's verify the project structure:

jobs:
build:

runs-on: ubuntu-latest
Contributor

🛠️ Refactor suggestion

Consider multi-platform testing requirements

Running only on Ubuntu might not be sufficient for a Bitcoin integration that needs to work across different platforms.

Consider adding more platforms:

-    runs-on: ubuntu-latest
+    strategy:
+      matrix:
+        os: [ubuntu-latest, macos-latest]
+    runs-on: ${{ matrix.os }}

Comment on lines +14 to +23
steps:
- uses: actions/checkout@v4
- name: configure
run: ./configure
- name: make
run: make
- name: make check
run: make check
- name: make distcheck
run: make distcheck
Contributor

🛠️ Refactor suggestion

Enhance workflow with proper dependency management and artifacts

The current workflow lacks several critical components needed for a robust CI pipeline:

  1. No dependency caching
  2. No artifact preservation
  3. No timeout configurations
  4. No error handling for build steps

Apply these enhancements:

     steps:
     - uses: actions/checkout@v4
+    - uses: actions/cache@v3
+      with:
+        path: |
+          ~/.cache/go-build
+          ~/go/pkg/mod
+        key: ${{ runner.os }}-go-${{ hashFiles('**/go.sum') }}
+        restore-keys: |
+          ${{ runner.os }}-go-
     - name: configure
+      timeout-minutes: 10
       run: ./configure
     - name: make
+      timeout-minutes: 15
       run: make
     - name: make check
+      timeout-minutes: 10
       run: make check
     - name: make distcheck
+      timeout-minutes: 10
       run: make distcheck
+    - name: Upload build artifacts
+      if: success()
+      uses: actions/upload-artifact@v3
+      with:
+        name: build-artifacts
+        path: |
+          build/
+          dist/
+        retention-days: 7

Committable suggestion skipped: line range outside the PR's diff.

Comment on lines +18 to +23
- name: make
run: make
- name: make check
run: make check
- name: make distcheck
run: make distcheck
Contributor

💡 Codebase verification

Standard make targets are not present in this repository

The repository contains multiple Makefiles, but none of them implement the standard targets (make, make check, make distcheck) used in the workflow. Instead, the Makefiles are Go-oriented and contain targets like:

  • test for running Go tests
  • build for building Go binaries
  • codegen for generating protobuf files
  • format for code formatting

The workflow needs to be updated to use the appropriate make targets for this Go-based project.

🔗 Analysis chain

Verify make targets and build system

The workflow assumes standard make targets (make, make check, make distcheck) which may not be appropriate for a Cosmos SDK project.

Let's verify the make targets:

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Check if Makefile exists and contains the required targets
if [ -f "Makefile" ]; then
    echo "Makefile exists"
    echo "Available targets:"
    grep "^[a-zA-Z0-9-]*:" Makefile
else
    echo "Makefile does not exist"
fi

Length of output: 143


Script:

#!/bin/bash
# Let's check for any Makefile in the repository and its content
fd Makefile --type f --exec cat {}

Length of output: 9679
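
Given that survey, a hedged sketch of build and test steps that use the repository's Go-oriented make targets instead of ./configure; the target names build and test come from the summary at the top of this comment, while the Go version and action versions are illustrative choices, not project requirements:

    steps:
      - uses: actions/checkout@v4

      - name: Set up Go
        uses: actions/setup-go@v5
        with:
          go-version: '1.21'  # illustrative; pin to whatever the repository's go.mod requires

      - name: Build
        run: make build

      - name: Test
        run: make test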

Comment on lines +16 to +17
- name: configure
run: ./configure
Contributor

@coderabbitai coderabbitai bot Jan 4, 2025

💡 Codebase verification

The configure script is missing and the workflow needs to be updated

This is primarily a Go-based Cosmos SDK project. While it contains some C/C++ code (mainly in the secp256k1 library), the configure script doesn't exist in the root directory. The C/C++ files are part of a vendored dependency in crypto/keys/secp256k1/ which has its own build system.

  • The GitHub Actions workflow at .github/workflows/c-cpp.yml needs to be updated as it incorrectly assumes the presence of a ./configure script in the root directory
  • If C/C++ builds are needed, the workflow should be modified to target the specific subdirectory containing the C/C++ code and use the appropriate build commands for that component
🔗 Analysis chain

Verify the existence of configure script

The workflow assumes the presence of a ./configure script, which is typical for C/C++ projects but may not exist in a Cosmos SDK project.

Let's verify the configure script:

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Check if configure script exists and is executable
if [ -f "./configure" ]; then
    echo "Configure script exists"
    if [ -x "./configure" ]; then
        echo "Configure script is executable"
    else
        echo "Configure script is not executable"
    fi
else
    echo "Configure script does not exist"
fi

Length of output: 97


Script:

#!/bin/bash
# Let's check if this is actually a C/C++ project by looking for common C/C++ files
echo "Searching for C/C++ source files:"
fd -e c -e cpp -e h -e hpp

# Also check for common build system files
echo -e "\nSearching for common C/C++ build system files:"
fd -g "CMakeLists.txt"
fd -g "Makefile*"
fd -g "configure.ac"
fd -g "configure.in"
fd -g "autogen.sh"

Length of output: 5908

This comment was marked as spam.

This comment was marked as spam.

Contributor

@bearycool11 I see that the C/C++ workflow file hasn't been updated yet. It still contains the ./configure and make commands that aren't suitable for this project. Since this is primarily a Go-based Cosmos SDK project, would you like me to suggest an updated version of the workflow that's better aligned with the project's structure? The workflow could either:

  1. Be removed if C/C++ specific CI isn't needed, or
  2. Be updated to specifically target the C/C++ code in the crypto/keys/secp256k1/ directory

Which approach would you prefer?

🔗 Analysis chain

@bearycool11 Welcome back! 👋 I noticed you asked earlier if new commit files helped. Let me check the current state of the C/C++ workflow file to see if the issues we discussed have been addressed.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Check if the c-cpp.yml file has been modified recently
git log -1 --pretty=format:"%H" -- .github/workflows/c-cpp.yml

# Check the current content of the file to see if it's been updated
if [ -f ".github/workflows/c-cpp.yml" ]; then
    cat .github/workflows/c-cpp.yml
fi

Length of output: 432
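
If the second option were chosen, a minimal sketch of a workflow scoped to the vendored C code might look like this; the directory, the autotools entry points, and the path filter are assumptions based on the search results above, not a verified build recipe for this repository:

name: C/C++ CI (secp256k1 only)

on:
  pull_request:
    paths:
      - 'crypto/keys/secp256k1/**'

jobs:
  build-secp256k1:
    runs-on: ubuntu-latest
    timeout-minutes: 15
    steps:
      - uses: actions/checkout@v4
      - name: Build vendored secp256k1
        # Assumed location and build entry point; adjust to the library's actual
        # autogen.sh/configure layout inside the vendored directory.
        working-directory: crypto/keys/secp256k1
        run: |
          ./autogen.sh
          ./configure
          make
          make check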

@tac0turtle tac0turtle closed this Jan 4, 2025